#unity reference prefab in script
true-dream-games · 10 months ago
Text
Through Yourself: Dev Log Week 1-Week 3
Since I've only just made this page, I'll go over what has been done in the last three weeks:
Completed version 1.0 of the GDD for 'Through Yourself'; ensured that this project is small enough to be done with a small team, but rich in story
Reference images added to GDD
Started the Unity project; added a textured ball, a camera, and a pill for testing purposes on a flat, colored plane
Created basic scripting for movement (FPSMovement)
Created basic scripting for camera control (FirstPersonCamera)
Created basic scripting for picking up objects (HoldObjects which became HoldingObjects later)
Revised movement and camera control scripts to work better with the placeholder 'Player' object as well as picking up objects
Added Object Rotation to HoldingObjects script; removed old script from textured ball placeholder and added new HoldingObjects script to Player prefab
Tuned object movement on pick-up so objects come to the hand more smoothly; further refined FPSMovement and FirstPersonCamera to work more easily with the new HoldingObjects script
Started work on making objects snap to specific places
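For readers curious what these scripts involve: here is a minimal sketch of the general pattern a first-person movement script like FPSMovement might follow. This is purely illustrative - the values and structure are my own assumptions, not the project's actual code:

    using UnityEngine;

    // Illustrative sketch only - not the actual FPSMovement script from this project.
    [RequireComponent(typeof(CharacterController))]
    public class FPSMovement : MonoBehaviour
    {
        public float moveSpeed = 5f;   // hypothetical tuning value
        public float gravity = -9.81f;

        CharacterController controller;
        float verticalVelocity;

        void Start()
        {
            controller = GetComponent<CharacterController>();
        }

        void Update()
        {
            // Read WASD/arrow input and move relative to where the player faces.
            Vector3 input = transform.right * Input.GetAxis("Horizontal")
                          + transform.forward * Input.GetAxis("Vertical");
            controller.Move(input * moveSpeed * Time.deltaTime);

            // Simple gravity so the controller stays grounded.
            verticalVelocity = controller.isGrounded ? -1f : verticalVelocity + gravity * Time.deltaTime;
            controller.Move(Vector3.up * verticalVelocity * Time.deltaTime);
        }
    }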
I am aware that this is not a whole lot at first glance, but I have a vision, and am determined to see it through. I do hope you all take a chance on this, and watch as it progresses.
---Melodania
20 notes · View notes
spellsword-dev · 9 months ago
Text
Progress with damage system
Quite a big breakthrough for my humble self! After watching a ton of vids on Events and Delegates in Unity I made a basic foundation for handling events like damaging enemies in a way that's modular and (hopefully) scalable. Before, I had an issue with damage being propagated to every copy of an enemy prefab on the map. That sucked, and I researched options for isolating affected enemies from the rest. What seemed like a good option was using HashSets (unordered collections of unique objects). An enemy intersects my hitbox and is added to the set. If it leaves the hitbox it is removed from the set. Upon attack, only enemies in the set would receive damage. Here's a preview of a counter working as intended - the player stores the number of affected cubes inside the hit radius. (In the console is a scrolling list of named cubes currently entering or exiting the set.)
Tumblr media
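For anyone wanting to try the same approach, a minimal sketch of the HashSet idea might look like this (class and variable names are my assumptions, not necessarily the post's actual code):

    using System.Collections.Generic;
    using UnityEngine;

    // Sketch: track unique enemies inside a trigger hitbox with a HashSet.
    public class EnemyTracker : MonoBehaviour
    {
        readonly HashSet<GameObject> enemiesInRange = new HashSet<GameObject>();

        void OnTriggerEnter(Collider other)
        {
            // Add returns false if the enemy is already tracked, so duplicates are impossible.
            if (other.CompareTag("enemy") && enemiesInRange.Add(other.gameObject))
                Debug.Log(other.name + " entered; count = " + enemiesInRange.Count);
        }

        void OnTriggerExit(Collider other)
        {
            if (enemiesInRange.Remove(other.gameObject))
                Debug.Log(other.name + " exited; count = " + enemiesInRange.Count);
        }
    }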
Very cool! And also very wrong and unnecessary. The bugs this would eventually create are unfathomable. It didn't help me much, but I'll keep the hash idea for later. I suspected the issue was in the assignment of the object that receives damage, not the hitbox, so I scrapped the HashSet idea. After some fiddling I found a solution by slightly changing how damage is applied to the object. It turns out that getting a reference to the affected object via the player's attack script worked perfectly. I'm not sure why GetComponent feels like a bad thing - it's an irrational negative bias - but it works. Scan for instances of anything tagged as "enemy", assign each to a temporary variable, and make it take damage via an event stored in the enemy's script.
Tumblr media
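A rough sketch of this final approach, assuming the enemy owns a damage event as described (all names here are mine, for illustration only):

    using System;
    using UnityEngine;

    // Hypothetical enemy script: damage is applied through an event it owns.
    public class Enemy : MonoBehaviour
    {
        public event Action<int> Damaged;
        public void TakeDamage(int amount) => Damaged?.Invoke(amount);
    }

    public class PlayerAttack : MonoBehaviour
    {
        public float hitRadius = 2f;   // assumed value

        public void Attack(int damage)
        {
            // Scan the hit radius, grab each enemy's script with GetComponent,
            // and fire the event stored in that enemy's script.
            foreach (Collider hit in Physics.OverlapSphere(transform.position, hitRadius))
            {
                if (!hit.CompareTag("enemy")) continue;
                Enemy enemy = hit.GetComponent<Enemy>();   // the temporary reference
                if (enemy != null) enemy.TakeDamage(damage);
            }
        }
    }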
So it seems like it's fixed now (within the confines of an empty testing scene). Now I can apply damage only to enemies affected by relevant hitboxes. With this I'll try adding it to my melee and ranged modes - slashing with the sword and blasting spells. I've also added a kill counter and a sum of dealt damage to this prototype for fun:
I want to make a presentable prototype soon :)
2 notes · View notes
admitonegame · 2 years ago
Text
Admit One Dev Blog: Update 1
Did some work todaaay on Laaabour Dayyy Today I did some work on learning how to use animation notifications and animation blending to allow the Player to draw an equipped weapon in Unreal Engine. One of the learning resources I've been using is this Udemy course here (if you're interested in grabbing it, it might be worthwhile to wait until it's on sale, since Udemy sales occur pretty frequently). While it's likely I'll rip out and replace much of what I'm doing with this tutorial series (or have to reconnect a lot of things once I have custom art and animations), I still feel like running through this series has been helpful, since I'm pretty new to Unreal (much more experienced in Unity and another now-defunct game engine) and have been learning a lot.
This leads me to: ================================================ THE UNREAL LEARNING CORNER: Part 1 Whenever this segment shows up here I'll likely talk about some problem I've run into and explain how I fixed it. None of these will be "be all end all" solutions, but hopefully the record of these mishaps will be useful for someone. It will also be a good place to show off any funny bugs I may run into. The things I have today are pretty dry though. LET THE PARENT SKELETAL MESH CONTAIN THE MESH
Tumblr media
So an Actor or Character Blueprint (a kind of prefab in UE that can also contain visual scripting code) can contain a Skeletal Mesh component. In an early effort to organize how the Player Character and customization would be set up (I would like there to be character customization, though no guarantee of that in a year), I wanted to place all Meshes (art/models) underneath one parent in an attempt to segment where all the clothing and other assets would be placed. I've basically achieved a similar goal by using a "Child Actor" (R and LFoot above). While it remains to be seen if that will completely work, I'm at least confident letting the parent Mesh actually contain the Player's model prevents these two issues below.
Tumblr media
When trying to reference the Mesh of a Blueprint, Unreal will grab the uppermost Mesh in its hierarchy by default, so without that Get Child Component node, there's no way it could actually affect my character in the way I wanted it to - in this case, attaching a sword to their side. While it's kind of annoying to add another node, this wouldn't be so bad; however:
Tumblr media
I couldn't find a way to access the child model and tell it to perform an animation. There may be a way to do this, but ultimately just making the parent Skeletal Mesh actually contain the Skeletal Mesh seems to be the way to go, so as to avoid having to filter through what's supposed to be affected. TRIGGERED VS STARTED Also, you can see above that I'm activating the Anim Montage with the Triggered type of input detection(?). I haven't had the best luck with that just yet; I think it's for actions you're meant to hold down, but Ongoing could also be used for that? Unsure. Regardless, I recommend using Started for something you just want to hit once, like a button press. Using Triggered here was actually causing a variable to change itself four times per button press, which resulted in some inconsistent behavior and confusion.
Tumblr media
================================================ So a bit simple today but I'm hoping these will steadily get more exciting as we go along. Thanks for reading!
2 notes · View notes
nareshjuego · 1 year ago
Text
Unity Game Studios: Best Practices for Efficient Game Development
Unity has become one of the most popular game engines for developers of all sizes, offering powerful tools and a flexible workflow that streamline game development. However, maximizing efficiency in game development requires more than just using the right tools; it involves adopting best practices that optimize workflows, enhance collaboration, and ensure high-quality outcomes. In this blog, we will explore best practices for efficient game development using Unity, helping you and your team create games more effectively and efficiently.
Tumblr media
1. Plan and Pre-Production
Define Clear Objectives and Scope
Before diving into development, it’s essential to have a clear understanding of your game’s objectives, target audience, and scope. Define the core mechanics, features, and narrative elements. Create a game design document (GDD) that outlines these aspects in detail. This document will serve as a reference throughout the development process, ensuring everyone on the team is aligned.
Create a Prototype
Building a prototype early in the development process helps validate your game’s core mechanics and concepts. Unity’s rapid prototyping capabilities allow you to quickly test and iterate on your ideas. A prototype can reveal potential issues and provide valuable insights that inform subsequent development stages.
2. Optimize Workflow and Organization
Use Version Control
Implementing version control is crucial for managing your project’s files and codebase. Tools like Git and services like GitHub or Bitbucket enable collaborative development, track changes, and prevent data loss. Unity integrates well with these tools, making it easy to manage and sync project updates across the team.
Organize Your Project Structure
Maintain a well-organized project structure to enhance efficiency and readability. Create a consistent folder hierarchy for assets, scripts, scenes, and other project elements. Unity’s Asset Store offers templates and tools that can help standardize your project structure.
Leverage Prefabs and Asset Reusability
Use prefabs to create reusable game objects and components. Prefabs allow you to manage and update multiple instances of an object from a single source, saving time and reducing errors. Organize your prefabs in a dedicated folder for easy access and management.
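As a quick illustration of this, the usual pattern for referencing a prefab in a script is a serialized field assigned in the Inspector (the names here are just an example):

    using UnityEngine;

    public class Spawner : MonoBehaviour
    {
        // Drag the prefab from the Project window onto this field in the Inspector.
        [SerializeField] GameObject enemyPrefab;

        void Start()
        {
            // Every instance comes from the single prefab source,
            // so updating the prefab propagates to all instances' template.
            Instantiate(enemyPrefab, new Vector3(0f, 0f, 5f), Quaternion.identity);
        }
    }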
3. Optimize Performance
Efficient Asset Management
Optimize your game assets to ensure smooth performance across different platforms. Use appropriate levels of detail (LOD) for models, compress textures, and manage polygon counts. Unity’s Profiler tool can help identify performance bottlenecks related to assets.
Script Optimization
Efficient scripting is key to maintaining performance. Avoid using expensive operations in update loops, and use object pooling to manage frequently instantiated and destroyed objects. Profiling tools in Unity can help monitor and optimize your scripts’ performance.
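For instance, a minimal object pool might look like the following sketch (a common pattern rather than a built-in Unity API; the names are illustrative):

    using System.Collections.Generic;
    using UnityEngine;

    // Minimal pool: reuse instances instead of calling Instantiate/Destroy every time.
    public class BulletPool : MonoBehaviour
    {
        [SerializeField] GameObject bulletPrefab;
        readonly Queue<GameObject> pool = new Queue<GameObject>();

        public GameObject Get(Vector3 position)
        {
            GameObject bullet = pool.Count > 0 ? pool.Dequeue() : Instantiate(bulletPrefab);
            bullet.transform.position = position;
            bullet.SetActive(true);
            return bullet;
        }

        public void Release(GameObject bullet)
        {
            bullet.SetActive(false);   // deactivated, not destroyed
            pool.Enqueue(bullet);      // ready for reuse
        }
    }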
Manage Memory Usage
Monitor and manage memory usage to prevent crashes and slowdowns. Use Unity’s memory profiler to identify memory leaks and optimize asset loading and unloading. Implement efficient garbage collection practices to minimize runtime memory overhead.
4. Enhance Collaboration
Use Collaboration Tools
Effective collaboration tools are essential for efficient teamwork. Tools like Unity Collaborate, Slack, Trello, and Asana facilitate communication, task management, and project tracking. Unity Collaborate allows team members to sync their work and resolve conflicts easily.
Regular Team Meetings and Reviews
Conduct regular team meetings to discuss progress, address challenges, and plan next steps. Code reviews and playtesting sessions provide opportunities for feedback and quality assurance, ensuring that the project stays on track and maintains high standards.
5. Testing and Quality Assurance
Automated Testing
Implement automated testing to catch bugs and ensure stability. Unit tests, integration tests, and performance tests help verify that your game’s components work as intended. Unity Test Framework and other testing tools can streamline this process.
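As a small example, an edit-mode test with the Unity Test Framework can be as simple as this (the DamageCalculator class is hypothetical, included only so the test compiles):

    using NUnit.Framework;

    // Hypothetical piece of game logic under test.
    public static class DamageCalculator
    {
        public static int Compute(int baseDamage, bool isCritical)
            => isCritical ? baseDamage * 2 : baseDamage;
    }

    public class DamageTests
    {
        [Test]   // appears in Window -> General -> Test Runner
        public void CriticalHit_DoublesBaseDamage()
        {
            Assert.AreEqual(20, DamageCalculator.Compute(10, true));
        }
    }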
Regular Playtesting
Frequent playtesting is crucial for identifying gameplay issues and improving user experience. Gather feedback from diverse testers to uncover different perspectives and potential improvements. Iterate on feedback to refine gameplay mechanics and polish the overall experience.
6. Documentation and Knowledge Sharing
Comprehensive Documentation
Maintain thorough documentation for your project, including code comments, development guidelines, and workflow processes. Well-documented projects are easier to manage, especially when onboarding new team members or revisiting the project after a hiatus.
Knowledge Sharing
Foster a culture of knowledge sharing within your team. Regularly share tips, best practices, and lessons learned through meetings, internal forums, or documentation. Encourage team members to contribute to shared knowledge bases and learn from each other’s experiences.
7. Continuous Learning and Improvement
Stay Updated with Unity’s Features
Unity is constantly evolving, with regular updates introducing new features and improvements. Stay informed about the latest developments by following Unity’s official blog, forums, and release notes. Incorporate new features and best practices into your workflow to stay competitive and efficient.
Invest in Training and Development
Encourage continuous learning through training and professional development opportunities. Attend Unity workshops, webinars, and conferences to stay current with industry trends and advancements. Investing in your team’s skills and knowledge pays off in the long run by enhancing overall efficiency and innovation.
Conclusion
Efficient game development with Unity requires a combination of planning, optimization, collaboration, and continuous improvement. By following these best practices, you can streamline your workflows, enhance team productivity, and create high-quality games that stand out in the competitive market. Unity’s powerful tools and supportive community provide a solid foundation, but it’s the application of these best practices that truly revolutionizes the development process. Embrace these strategies to maximize your potential and achieve success in your game development endeavors.
1 note · View note
iat410-wallace · 1 year ago
Text
Blog Post #3 (Feb. 15th)
Over the past 2 weeks, the team began development of the game using Unity, completing a cumulative prototype for the first milestone. This included both exploration and management activities, as well as implementing early visuals for UI.
I was tasked with creating the shop scene which facilitates the management portion of our game. This included the angled top-down view where players move as the avatar-based herbalist, and then interact with a workstation to enter an herb mixing and grinding sequence. An NPC would also walk in to place an order to prompt the player for which herbs they want in their mix.
youtube
My first rendition of this is seen above, where the NPC walks in and places an order using three of the existing herbs. The player can interact with a workstation, leading to a drag & drop herb activity. Once herbs are in the mortar, the player can press the space bar to simulate grinding the herbs.
youtube
After further development in the first week, I took visuals from the artists Bea & Zynab and implemented them into the game, adding the temporary prototype main menu and tutorial screens. Additionally, while the other developer Ilyas worked on the exploration scene, I made a temporary mini-world to test the collectible herb prefabs that I made. The grinding activity was reviewed by the team, and we decided to make it more kinaesthetic instead, having the player physically grind the herbs for more immersive simulation. The NPC now also gave a response to the order, and would check whether or not you got it right.
youtube
By the second week, I further finalized the game to prepare for playtesting, adding two additional herb types and random NPC orders. This helped add challenge to the prototype, making it feel more like a complete game. The NPC would return in an endless loop after completing each order, with a new prompt each time.
Tumblr media
The last step was combining the scenes I created with the outdoor exploration scene Ilyas created, which utilized the collectible herbs prefabs I made, as well as moving the shop entry point to a temporary graphic by Bea. I ended the week by playtesting, as well as playing around with Timeline in Unity as we will need to create multiple cutscenes to implement narrative beats.
The team has not encountered any issues thus far. We are working collaboratively: developers are utilizing GitHub to work simultaneously on different scripts and scenes, and artists are creating graphics and designing UI, sharing completed pieces with the developers. My personal challenge thus far has been learning the Unity interface and functionalities, as it is my first time working with the game engine. With my experience in Java & Processing, I can quickly transfer my coding knowledge to C#, and I can properly use scripts thanks to what I learned while coding with Lua. I will continue to develop my skills by self-learning through reference documentation and video tutorials.
For the following weeks, I will fix some interaction bugs, such as keypresses not registering properly, as well as optimize my existing scripts. I will be further developing the shop scene with updated graphics and starting to define more complex NPC behavior as we make official levels for our game.
0 notes
generalistprogrammer · 5 years ago
Link
0 notes
adamtheamazing5 · 2 years ago
Video
youtube
This project is a testing prototype that I have used to test various mechanics, Unity features, data types, etc. Some things I tested I already knew (e.g. player moves forward and back, rotates), others I learned (e.g. loading from "Resources" and activating other game objects' Animators from code, respectively). The goal of the project is to test out unfamiliar essentials to improve my C# programming and general Unity engine knowledge. I created this little level with a 3D toolkit's modular prefabs and particles as a start to testing.
WHAT I AM DEMONSTRATING:
• Player movement and its abilities to shrink and regrow, change color, and shoot randomized size lava balls.
• Raycast mechanics such as spawning rocks, moving the player cube, changing rock colors and detecting certain objects like “items”.
• Each trigger has a distinctive behavior that responds when the player enters it, such as camera triggers, a light trigger, an animation trigger, etc.
• Speaking of camera triggers, each camera trigger can be triggered by the player to be idle, follow, move or both (except idle).
• Creating a mesh from code using a certain number of UVs, vertices, tris, etc.
• Loading objects from resources without getting a public reference to the game object.
• Creating an object by code.
• Applying gravity to a game object so that it sticks to a sphere, as if it were on a planet like Earth.
• Minimap follows and rotates with the player, also can be expanded fullscreen.
• Particle activates on certain game objects if the player collides with that object.
WHAT I LEARNED:
--Player Mechanics--
• Changing the player cube's size with transform.localScale via inputs.
• When instantiating lava balls, changing the game object's name to "sphere" instead of "sphere (Clone)" and creating a random range of Vector3.one scales, meaning each ball spawns at a random size (see the sketch after this list).
• Making lava balls use ForceMode.Acceleration to ignore their mass, so they don't slow down or speed up depending on the mass.
• 2 ways to write Vector3.one.
• Changing the player cube's colour via GetComponent<Renderer>().material.color from inputs.
• Being able to instantiate dust particles against a certain wall from the player cube's rotation/angle, using Quaternion.identity with its transform.position.
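A rough reconstruction of the lava-ball points above might look like this (my own sketch with assumed values, not the project's actual code):

    using UnityEngine;

    public class LavaBallSpawner : MonoBehaviour
    {
        [SerializeField] GameObject lavaBallPrefab;
        public float launchForce = 20f;   // assumed value

        void Update()
        {
            if (Input.GetButtonDown("Fire1")) Shoot();
        }

        void Shoot()
        {
            GameObject ball = Instantiate(lavaBallPrefab, transform.position, Quaternion.identity);
            ball.name = "sphere";   // strip Unity's default "(Clone)" suffix

            // Random uniform scale: each ball spawns at a random size.
            ball.transform.localScale = Vector3.one * Random.Range(0.5f, 2f);

            // ForceMode.Acceleration ignores mass, so size doesn't change the speed.
            ball.GetComponent<Rigidbody>().AddForce(transform.forward * launchForce,
                                                    ForceMode.Acceleration);
        }
    }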
--Camera manager and Triggers—
• Made two separate scripts that can smoothly follow or look at the player depending on which camera trigger the player enters. This is done with Vector3s, LookAt, quaternions, Lerp, and Slerp, depending on the script. It can offset the camera if needed too.
• Using camera triggers to store a reference to the Camera Manager's camera states, to control which of the 4 states executes specific code. Depending on the selected public state when the player cube enters the trigger, the camera will switch states (more info in the next point).
• Creating a Camera Manager that controls the current camera behavior states - Idle, Follow, Look At, or both Follow and Look At - using switch statements with functions.
• Creating a minimap for the first time ever by adding a second camera above the player and rendering a certain distance below. The minimap can follow the player cube by moving and rotating with a Vector3 and Quaternion.Euler, respectively.
• Making the minimap expand fullscreen by pressing the M key and shrink to its default size once M is pressed again, using the camera's rect.
• Activating another game object's Animator to play a specific animation bool when the player's cube enters a trigger referencing that Animator, and playing a different animation if the player's cube leaves the trigger.
• After entering a door trigger, using a LeanTween animation to smoothly open the door downwards over only 2 seconds. LeanTween also bypasses hand-writing complex Lerp code.
--Raycasting and Mouse events--
• Raycasting from the camera onto a collider (the ground) to get the player cube to move to a new position wherever the player has clicked, while still making sure it stays on the same y position while moving (a reconstruction sketch follows this list).
• Using the Rigidbody's MovePosition method, which allows the player cube to move to a new position once the player has clicked on that position.
• Making the player cube follow the mouse cursor to a new position on the ground by aiming from the camera (cam.ScreenPointToRay(inputPosition)) - basically following the player's cursor.
• How raycasts store info on what they hit.
• How certain game objects can be detected by the raycast when checking the name via hit.collider.name, and scaling them with transform.localScale from the same raycast.
• Using an array to store data on what the raycast hits, based on the game object's name, in the console, and turning certain game objects green if they have a Renderer on their parent game object.
• How to spawn rocks from the camera raycast when the mouse is clicked on a collider.
• How mouse events are not based on the cursor's visuals but only on what the mouse touches or hovers over.
• The difference between OnMouseEnter, OnMouseExit, and OnMouseDown, and a few others like OnMouseDrag and OnMouseOver.
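Here is the reconstruction sketch mentioned above: click-to-move via a camera raycast plus Rigidbody.MovePosition (again my own illustrative code, with an assumed move speed):

    using UnityEngine;

    // Sketch: click the ground, raycast from the camera, move the cube there.
    public class ClickToMove : MonoBehaviour
    {
        [SerializeField] Camera cam;
        public float speed = 5f;   // assumed value
        Rigidbody rb;
        Vector3 target;

        void Start()
        {
            rb = GetComponent<Rigidbody>();
            target = rb.position;
        }

        void Update()
        {
            if (Input.GetMouseButtonDown(0))
            {
                Ray ray = cam.ScreenPointToRay(Input.mousePosition);
                if (Physics.Raycast(ray, out RaycastHit hit))
                {
                    // Keep the cube's current height; only x/z come from the click.
                    target = new Vector3(hit.point.x, rb.position.y, hit.point.z);
                    Debug.Log("Hit " + hit.collider.name);   // the hit stores what it touched
                }
            }
        }

        void FixedUpdate()
        {
            // MovePosition plays nicely with physics, unlike setting transform.position.
            rb.MovePosition(Vector3.MoveTowards(rb.position, target, speed * Time.fixedDeltaTime));
        }
    }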
--Other Mechanics and misc--
• Lerps and Slerps are good for moving game objects without a Rigidbody.
• The Resources API can find game objects in a Resources folder via Resources.Load and load them into the scene without using a public game object reference.
• Resources.FindObjectsOfTypeAll can be used to locate assets and scene objects.
• Turning the player cube's gravity off and setting some constraints when it is positioned on a sphere (like a planet) - sketched below.
• How FixedUpdate is good for rigid bodies, particularly for movement.
• How the player's gravity attractor keeps pulling toward the sphere every physics update, whether the player is moving or not, so it does not fall off the sphere.
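The planet-gravity sketch mentioned above - a common "faux gravity" pattern, written from scratch here as an assumption of how such an attractor usually works:

    using UnityEngine;

    // The planet: pulls bodies toward its centre and aligns them to the surface.
    public class GravityAttractor : MonoBehaviour
    {
        public float gravity = -9.8f;

        public void Attract(Rigidbody body)
        {
            Vector3 up = (body.position - transform.position).normalized;

            // Pull toward the sphere's centre (built-in gravity must be off).
            body.AddForce(up * gravity);

            // Rotate the body so its up axis matches the surface normal.
            body.rotation = Quaternion.FromToRotation(body.transform.up, up) * body.rotation;
        }
    }

    // The player/body: calls the attractor every physics step, moving or not.
    public class PlanetBody : MonoBehaviour
    {
        public GravityAttractor planet;
        Rigidbody rb;

        void Start()
        {
            rb = GetComponent<Rigidbody>();
            rb.useGravity = false;                                // gravity off, as noted above
            rb.constraints = RigidbodyConstraints.FreezeRotation; // the constraints mentioned
        }

        void FixedUpdate()
        {
            planet.Attract(rb);
        }
    }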
-Creating Mesh and objects--
• Creating a mesh by code by first adding Mesh Filter and Mesh Renderer components and clearing the mesh empty, then adding vertices, UVs, and triangles using arrays of Vector3s, Vector2s, and ints, all in code.
• Creating quads by code by determining an array of Vector3 vertices from the quad height and quad width, tris as an int array to set on the quad, normals as a Vector3 array, and UVs as a Vector2 array, then assigning them all to the quad (see the sketch after this list).
• Getting the created quad to move back and forth by using a while loop to move its vertices and normals, multiplying by Mathf.Sin(Time.time).
• Creating a new game object by code by first giving it a name and a mesh type such as a sphere, assigning it a position, then adding a Rigidbody - all without assigning anything in the Inspector before pressing play.
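And the quad sketch mentioned above - a minimal, self-contained version of building a quad mesh from code (illustrative; the project's version surely differs):

    using UnityEngine;

    [RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
    public class QuadBuilder : MonoBehaviour
    {
        void Start()
        {
            Mesh mesh = new Mesh();

            // Four corners of a 1x1 quad.
            mesh.vertices = new Vector3[]
            {
                new Vector3(0, 0, 0), new Vector3(1, 0, 0),
                new Vector3(0, 1, 0), new Vector3(1, 1, 0)
            };

            // Two triangles, wound so the face points toward the camera.
            mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };

            mesh.uv = new Vector2[]
            {
                new Vector2(0, 0), new Vector2(1, 0),
                new Vector2(0, 1), new Vector2(1, 1)
            };

            mesh.RecalculateNormals();
            GetComponent<MeshFilter>().mesh = mesh;
        }
    }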
WHAT COULD I DO BETTER?
• Creating a mesh by code was very complicated. If there comes a time when I need those mechanics again for my future games, I may need to do some more research on how to make all sorts of unique meshes, like the default Unity 3D primitives.
• The raycast checker on the player cube detected camera triggers, making it difficult to detect the "item" by raycast. I will need to find other ways camera triggers can detect the player without getting in the way of the raycast.
• When the dust particle appears from the cube-wall collision, it should play in the same rotation/position as the cube instead of just one rotation.
• At the moment I could not get the dust particles to instantiate in the right rotation when colliding with certain walls; next time I will need to find out how to make them do so consistently.
OVERALL:
It was an interesting exercise. I got a better understanding of Unity, data types, and making mechanics. This will make my game dev journey much easier, since I understand the code a lot better. Hopefully, I can do something like this again to learn more advanced, expert-level essentials.
#pc  #GameDesigner  #Mechanic Test  #Unity3D  #gamedevelopment  #gamedev  #UnityEngine  #IndieGames  #AustralianGameDevoloper  #indiedev  #indiegame  #indie  #SoleDevoloper  #indiegaming  #IndieGameDev  #indiegames
0 notes
worthyenergylife · 3 years ago
Text
Adobe fuse unity3d materials
Tumblr media Tumblr media
I have made a script that imports characters from Adobe Fuse CC (Beta) and makes the necessary modifications to the materials and texture maps. Without this script the imported character materials are all broken.
Note, the attached script is not designed to work with characters downloaded direct from the Mixamo store. It is designed to take a character that was made in Fuse, and uses the textures that Fuse exports that are designed for use with Unity. The textures within the FBX files are not ideal for use in Unity.
Using this script the results are quite close to the quality within the Fuse window e.g.
Tumblr media
Setup:
Within Unity create a folder called Mixamo.
Create a folder within Mixamo called Editor.
Now the script should exist as Assets/Mixamo/Editor/MixamoUnityImporter.cs.
Usage:
Create a model using the Adobe Fuse application.
Export textures for Unity 5 into a new folder with the same name as the character.
From Fuse select File -> Animate with Mixamo.
Download the rigged character as a Unity FBX file.
Save the Mixamo FBX file beside the exported textures folder.
In Unity, click the menu Mixamo -> Import from Fuse with exported Unity textures.
Tumblr media
This results in importing the character to Assets/Mixamo// and a prefab is created that references the updated materials. Most solutions say to simply set the shader render modes to Opaque or Transparent/Fade, but this is only part of the solution this script performs:
Set shader render modes correctly per material.
Create new materials for Eyes and Eyelashes.
Correct the MetallicAndSmoothness map (has inverted alpha).
Set texture import settings correctly (alpha channel, normal maps).
Set smoothness per material to roughly match visuals in Adobe Fuse CC (beta).
Updated to handle masks (partially), gloves, shoes, hats.
Updated to correctly handle non-alpha hair.
Update to fix problem when textures are reimported e.g. by copying the imported character folder to another computer, or pulling commits of characters from Collaborate.
Tumblr media
This script wasn't released in any officially supported way, however. It really is just "as-is", and is somewhat brittle code that worked only for characters pre-supplied within the Adobe Fuse program (male/female fit/scan 1, 2, 3) and a single ...
I've looked at the provided gunman.zip and can repro the crash. It appears you've exported the object, modified the character, and importantly changed the textures - which is breaking the script's assumptions. The expected texture names are not being found; rather, the textures are all now packed into a single texture. The metallic/smoothness textures need their alpha channel to be inverted to work correctly. The script does this in MixamoAssetProcessor::OnPostprocessTexture() by setting alpha to (1 - alpha). After the import crashes, the files are still there and can be manually attached to the material, and the material Rendering Mode manually set to Opaque (this works for every texture component on this model except the eyelashes - packing makes it impossible to set the two modes required to make both body and eyelash textures work). Your shader looks great, so I'm sure it already does the right thing here, on-the-fly, without needing to change the texture.
I have not seen this assertion, so I cannot reproduce the problem to attempt a fix. The quoted code is not within my script, so I'm at a loss. But the numerals 4 and 7 NEVER APPEAR TOGETHER, IN QUOTES, out of quotes, EVER, in this script as a case statement. Debug.Log("texBase.format" + texBase.format) is the only way a NUMBER 47 MIGHT APPEAR. Scared to open a text editor and delete a case statement? It is in English, and is telling you what to do! This is a free script; it's your responsibility, no one else's. You need to learn the basics of coding 101 - if you don't know enough to even debug that, stop, learn, try again. It's hard and intimidating at first; even after 10 years, I screw up. Yesterday, I spent 4 hours trying to get a camera to switch from cockpit to flight view on a simple bool, yes or no. Back and forth I went, debug logging, yelling, slamming my keyboard, not realizing I had set that to always be true somewhere earlier in my code. I am an idiot sometimes! Anyway - I don't know.
0 notes
gameguruguy · 4 years ago
Text
Week 11: Lectures & Readings
It was my final lecture for IGB220 today, which is a bit sad because this has been my favourite unit this semester, but I guess on the bright side, no more driving home in 5pm traffic. Dr. Conroy gave the class a really great review of what is to come, along with a nice technical tour of some of the upcoming software. We discussed the likes of Unity and Unreal and their purpose in the industry and beyond. It was nice to have a bit of exposure to Unity and not be totally in the dark, unlike in my coding classes.
Tumblr media
I got to learn about prefabs and handy actions like linking scripts and making structures. I think I learnt the most from this lecture. I'm currently doing CAB201 (hello coding) and IGB283 (death by math), so I have a basic understanding of arrays and loops, but it was really cool to hear how important they are and how they can be used in gaming; it kind of boosted my efforts for success in those classes.
I continued on to part 3 of Fullerton's book this week, making my way through chapter 12 over a couple of nights. In this chapter Fullerton explored team structures and the roles that individuals can play in the hierarchy. I read about the specifics and various dynamics of publishers and developers, and learnt a lot more about these relationships and companies, which I previously knew only by name, with no idea what exactly they do. I found the chapter really helpful for my thought process on Assignment 3; even though we are only a team of 3, it got me thinking about the various roles I need to play in the team, as well as the roles that my group partners could take on. It really opened my eyes to how much is required of game developers, whether it is an indie project or a 'AAA' title.
References:
Image retrieved from: 
https://i.chzbgr.com/full/9340630784/h357E9FA4/text-when-you-write-10-lines-of-code-without-searching-on-google-itaint-much-but-its-honest-work
0 notes
wizcorp · 7 years ago
Text
Unite 2018 report
Introduction
A few Wizcorp engineers participated in Unite Tokyo 2018 in order to learn more about the future of Unity and how to use it for our future projects. Unite is a 3-day event held by Unity in different major cities, including Seoul, San Francisco and Tokyo. It takes the form of talks given by various Unity employees from around the globe, where they give insight into existing or future technologies and teach people about them. You can find more information about Unite here.
In retrospect, here is a summary of what we've learned or found exciting, and what could be useful for the future of Wizcorp.
First day
The presentation on ProBuilder was very interesting. It showed how to quickly make levels in a way similar to Tomb Raider for example. You can use blocks, slopes, snap them to grid, quickly add prefabs inside and test all without leaving the editor, speeding up the development process tremendously.
They made a presentation on ShaderGraph. You may already be aware about it, but in case you’re not, it’s worth checking it out.
They talked about the lightweight pipeline, which provides a new modular architecture to Unity, with the goal of getting it to run on smaller devices. In our case, that means we could get a web app in something as little as 72 kilobytes! If it delivers as expected (end of 2018), it may seriously weaken the case for sticking to web technologies.
They showed a playable web ad that loads and plays within one second over wifi. It then drives the player to the App Store. They think that this is a better way to advertise your game.
They have a new tool set for the automotive industry, allowing to make very good looking simulations with models from real cars.
They are making Unity Hack Week events around the globe. Check that out if you are not aware about it.
They introduced the Burst compiler, which aims to take advantage of the multi-core processors and generates code with math and vector floating point units in mind, optimizing for the target hardware and providing substantial runtime performance improvements.
They presented improvements in the field of AR, typically with a game that is playing on a sheet that you’re holding on your hand.
Anime-style rendering
They presented the processes they use in Unity to get as close as possible to anime-style rendering, and the result was very interesting. Nothing is rocket science though: it mostly includes effects that you would use in other games, such as full-screen distortion, blur, bloom, synthesis on an HDR buffer, cloud shading, a weather system through the use of fog, skybox color configuration, and fiddling with the character lighting volume.
Optimization of mobile games by Bandai Namco
In Idolmaster, a typical stage scene has 15k polygons only, and a character has a little more than that. They make the whole stage texture fit on a 1024x1024 texture for performance.
For post processing, they have DoF, bloom, blur, flare and 1280x720 as a reference resolution (with MSAA).
The project was started as an experiment in April of 2016, then was started officially in January of 2017, then released on June 29th of the same year.
They mentioned taking care to minimize draw calls and SetPass calls.
They use texture atlases with indexed vertex buffers to reduce memory use and improve performance.
They used the snapdragon profiler to optimise for the target platforms. They would use an approach where they try, improve, try again and then stop when it’s good enough.
One of the big challenges was to run live stages with 13 characters at once (lots of polygons / info).
Unity profiling and performance improvements
This presentation was made by someone who audits commercial games and gives them support on how to improve the performance or fix bugs.
http://github.com/MarkUnity/AssetAuditor
Mipmaps add 33% to texture size, try to avoid.
Enabling read/write in a texture asset always adds 50% to the texture size since it needs to remain in main memory. Same for meshes.
Vertex compression (in player settings) just uses half precision floating points for vertices.
Play with animation compression settings.
ETC Crunch textures are decrunched on the CPU, so be careful about the additional load.
Beware of animation culling: when offscreen, culled animations will not be processed (as if disabled). With non-deterministic animations, this means that when the animation is re-enabled, it has to be computed for all the time it was disabled, which may create a huge CPU spike (this can also happen when disabling and then re-enabling an object).
Presentation of Little Champions
Looks like a nice game.
Was started on Unity 5.x and was then ported on to Unity 2017.x.
They do their own custom physics processes, by using WaitForFixedUpdate from within FixedUpdate. The OnTriggerXXX and OnCollisionXXX handlers are called afterwards.
They have a very nice level editor for iPad that they used during development. They say it was the key to creating nice puzzle levels, to test them quickly, fix and try again, all from the final device where the game is going to be run on.
Machine learning
A very interesting presentation that showed how to teach a computer to play a simple Wipeout clone. It was probably the simplest you could get (since you only steer left or right, and look out for walls using 8 raycasts).
I can enthusiastically suggest that you read about machine learning yourself, since there’s not really room for a full explanation of the concepts approached there in this small article. But the presenter was excellent.
Some concepts:
You have two training methods: one is reinforcement learning (where the agent learns through rewards, trial and error, and super-speed simulation, so that it becomes "mathematically optimal" at the task) and one is imitation learning (like humans, learning through demonstrations, without rewards, requiring real-time interaction).
You can also use cooperative agents (one brain -- the teacher, and two agents -- like players, or hands -- playing together towards a given goal).
Learning environment: Agent <- Brain <- Academy <- Tensorflow (for training AIs).
Timeline
Timeline is a plugin for Unity that is designed to create animations that manipulate the entire scene based on time, a bit like Adobe Premiere™.
It consists of tracks, with clips which animate properties (a bit like the default animation system). It’s very similar but adds a lot of features that are more aimed towards creating movies (typically for cut scenes). For example, animations can blend among each other.
The demo he showed us was very interesting, it used it to create a RTS game entirely.
Every section would be scripted (reactions of enemies, cut scenes, etc.), and using conditions, the playhead would move to and execute the appropriate section of scripted gameplay.
He also showed a visual novel like system (where input was waited on to proceed forward).
He also showed a space shooter. The movement and patterns of bullets and enemies, then waves and full levels, would be made into tracks, and those tracks would be combined at the appropriate hierarchical level.
Ideas of use for Timeline: rhythm game, endless runner, …
On a personal note I like his idea: he gave himself one week to try creating a game using as much as possible this technology so that he could see what it’s worth.
What was interesting (and hard to summarize in a few lines here, but I recommend checking it out) is that he uses Timeline alternatively to dictate the gameplay and sometimes for the opposite. Used wisely it can be a great game design tool, to quickly build a prototype.
Timeline is able to instantiate objects, read scriptable objects and is very extensible.
It’s also used for programmers or game designers to quickly create the “scaffoldings” of a scene and give that to the artists and designers, instead of having them to guess how long each clip should take, etc.
Another interesting feature of Timeline is the ability to start or resume at any point very easily. Very handy in the case of the space shooter to test difficulty and level transitions for instance.
Suggest downloading “Default Playables” in the Asset Store to get started with Timeline.
Cygames: about optimization for mid-range devices
Features they used
Sun shaft
Lens flare (with the use of the Unity collision feature for determining occlusion, and it was a challenge to set colliders properly on all appropriate objects, including for example the fingers of a hand)
Tilt shift (not very convincing, just using the depth information to blur in post processing)
Toon rendering
They rewrote the lighting pipeline rendering entirely and compacted various maps (like the normal map) in the environment maps.
They presented where ETC2 is appropriate over ETC, which is basically that it reduces color banding, but takes more time to compress at the same quality and is not supported on older devices, and this was why they chose to not use it until recently.
Other than that, they mentioned various techniques that they used on the server side to ensure a good framerate and responsiveness. Also they mentioned that they reserved a machine with a 500 GB hard drive just for the Unity Cache Server.
Progressive lightmapper
The presentation was about their progress on the new lightmapper engine from which we already got a video some time ago (link below). This time, the presenter did apply that to a small game that he was making with a sort of toon-shaded environment. He showed what happens with the different parameters and the power of the new lighting engine.
A video: https://www.youtube.com/watch?v=cRFwzf4BHvA
This has to be enabled in the Player Settings (instead of the Enlighten engine).
The big news is that lights become displayed in the editor directly (instead of having to start the game, get Unity to bake them, etc.).
The scene is initially displayed without lights and little by little as they become available textures are updated with baked light information. You can continue to work meanwhile.
Prioritize view option: bakes what's visible in the camera viewport first (good for productivity; works just as you'd expect).
He explained some parameters that come into action when select the best combination for performance vs speed:
Direct samples -> simply vectors from a texel (pixel on texture) to all the lights, if it finds a light it's lit, if it's blocked it's not lit.
Indirect samples: they bounce (e.g. emitted from ground, then bounces on object, then on skybox).
Bounces: 1 should be enough on very open scenes, else you might need more (indoor, etc.).
Filtering smoothes out the result of the bake. Looks cartoonish.
They added the A-Trous blur method (preserves edges and AO).
Be careful about UV charts, which control how Unity divides objects (based on their normals, so each face of a cube would be in a different UV chart, for example); filtering stops at the edge of a chart, creating a hard edge. More UV charts = a more "faceted" render (like low-poly). Note that for a very large number of UV charts, the object will become round again, because the filtering will blur everything.
Mixed modes: normally lights are either realtime or baked.
3 modes: subtractive (subtract shadows with a single color; can appear out of place), shadowmask: bake into separate lightmaps, so that we can recolor them; still fast and flexible, and the most expensive one where all is done dynamically (useful for the sunlight cycle for example), and distance shadowmask uses dynamic only for objects close to the camera, else baked lightmaps.
The new C# Job system
https://unity3d.com/unity/features/job-system-ECS ← available from Unity 2018.1, along with the new .NET 4.x.
They are slowly bringing concepts from entity / component into Unity.
Eventually they’ll phase out the GameObject, which is too central and requires too much stuff to be mono-threaded.
They explain why they made the choice
Let's take a list of GameObjects, each having a Transform, a Collider and a RigidBody. Those parts are laid out in memory sequentially, object per object. A Transform is actually a lot of properties, so accessing only a few of the properties of a Transform across many objects (like particles) will be inefficient for cache accesses.
With the entity/component system, you need to request for the members that you are accessing, and it can be optimized for that. It can also be multi-threaded properly. All that is combined with the new Burst compiler, which generates more performant code based on the hardware.
Entities don't appear in the hierarchy, like Game Objects do.
In his demo, he manages to display 80,000 snowflakes in the editor instead of 13,000.
Here is some example code:
    public struct SnowflakeData : IComponentData
    {
        public float FallSpeedValue;
        public float RotationSpeedValue;
    }

    public class SnowflakeSystem : JobComponentSystem
    {
        private struct SnowMoveJob : IJobProcessComponentData<Position, Rotation, SnowflakeData>
        {
            public float DeltaTime;

            public void Execute(ref Position pos, ref Rotation rot, ref SnowflakeData data)
            {
                pos.Value.y -= data.FallSpeedValue * DeltaTime;
                rot.Value = math.mul(math.normalize(rot.Value),
                                     math.axisAngle(math.up(), data.RotationSpeedValue * DeltaTime));
            }
        }

        protected override JobHandle OnUpdate(JobHandle inputDeps)
        {
            var job = new SnowMoveJob { DeltaTime = Time.deltaTime };
            return job.Schedule(this, 64, inputDeps);
        }
    }

    public class SnowflakeManager : MonoBehaviour
    {
        public int FlakesToSpawn = 1000;
        public static EntityArchetype SnowFlakeArch;

        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
        public static void Initialize()
        {
            var entityManager = World.Active.GetOrCreateManager<EntityManager>();
            SnowFlakeArch = entityManager.CreateArchetype(
                typeof(Position), typeof(Rotation),
                typeof(MeshInstanceRenderer), typeof(TransformMatrix));
        }

        void Start() { SpawnSnow(); }

        void Update()
        {
            if (Input.GetKeyDown(KeyCode.Space)) { SpawnSnow(); }
        }

        void SpawnSnow()
        {
            var entityManager = World.Active.GetOrCreateManager<EntityManager>();

            // Temporary allocation, so that we can dispose of it afterwards.
            NativeArray<Entity> snowFlakes = new NativeArray<Entity>(FlakesToSpawn, Allocator.Temp);
            entityManager.CreateEntity(SnowFlakeArch, snowFlakes);

            for (int i = 0; i < FlakesToSpawn; i++)
            {
                entityManager.SetComponentData(snowFlakes[i], new Position { Value = RandomPosition() });   // RandomPosition made by the presenter
                entityManager.SetSharedComponentData(snowFlakes[i], new MeshInstanceRenderer { material = SnowflakeMat /* ... */ });
                entityManager.AddComponentData(snowFlakes[i], new SnowflakeData { FallSpeedValue = RandomFallSpeed(), RotationSpeedValue = RandomFallSpeed() });
            }

            // Dispose of the array.
            snowFlakes.Dispose();

            // Update UI (variables made by the presenter).
            numberOfSnowflakes += FlakesToSpawn;
            EntityDisplayText.text = numberOfSnowflakes.ToString();
        }
    }
Conclusion
We hope that you enjoyed reading this little summary of some of the presentations we attended.
As a general note, I would say that Unite is an event aimed at hardcore Unity fans. There is some time for networking with Unity engineers between the sessions (they come from all around the world), and not many beginners attend. It can be a good time to extend professional connections with (very serious) people from the Unity ecosystem, though it's not great for recruiting, for instance. But you have to go for it and make it happen. By default the program will just have you follow sessions one after the other, with value that is probably similar to what you would get by watching the official presentations on YouTube in a few weeks or months. I'm a firm believer that socializing is way better than watching videos from home, so you won't get me saying that it's a waste of time, but if you are to send people there, it's best when they are proactive and passionate about Unity themselves. If they just use it at work, I feel that the value is rather small, and I would even dare say it's a bit of a failure on the Unity team's part, as it can be hard to see who they are targeting.
There is also the Unite party, which you have to book way before, that may improve value for networking, but none of us could attend.
1 note · View note
yudiz123blog · 5 years ago
Text
AR with Unity! | Image Recognition Using AR Foundation
Tumblr media
Introduction to AR Foundation:
Augmented Reality can be used in Unity through AR Foundation. This interface makes Unity developer work easy. To use this package you also need some plugins which are mentioned below.
AR Foundation includes core features from ARKit, ARCore, Magic Leap, and HoloLens, as well as unique Unity features to build robust apps that are ready to ship to internal stakeholders or on any app store. This framework enables you to take advantage of all of these features in a unified workflow.
AR Foundation lets you take features with you when you switch between AR platforms, even those currently unavailable on a given platform.
The AR-related subsystems are defined in the AR Subsystems package. These APIs are in the UnityEngine.Experimental.XR namespace, and consist of a number of Subsystems.
AR Foundation uses Monobehaviours and APIs to deal with devices that support the following concepts :
Device tracking: track the device’s position and orientation in physical space.
Plane detection: detect horizontal and vertical surfaces.
Point clouds, also known as feature points.
Anchor: an arbitrary position and orientation that the device tracks.
Light estimation: estimates for average color temperature and brightness in physical space.
Environment probe: a means for generating a cube map to represent a particular area of the physical environment.
Face tracking: detect and track human faces.
2D image tracking: detect and track 2D images.
3D object tracking: detect 3D objects.
Meshing: generate triangle meshes that correspond to the physical space.
Body tracking: 2D and 3D representations of humans recognized in physical space.
Collaborative participants: track the position and orientation of other devices in a shared AR experience.
Human segmentation and occlusion: apply distance to objects in the physical world to rendered 3D content, which achieves a realistic blending of physical and virtual objects.
Raycast: queries physical surroundings for detected planes and feature points.
Pass-through video: optimized rendering of mobile camera image onto the touch screen as the background for AR content.
Session management: the platform-level configuration is manipulated automatically when AR features are enabled or disabled.
(Above concepts reference are taken from: https://docs.unity3d.com/Packages/[email protected]/manual/index.html)
Feature Support per Platform:
Tumblr media
Prerequisites
Unity 2019 version or above.
Android SDK level 24 and above.
Android device which has Android 7.0 and above.
ARCore XR plugin:
Currently, [email protected] preview is available. Adding this plugin enables ARCore support via Unity's multi-platform XR API. Supported features are:
Background Rendering
Horizontal Planes
Depth Data
Anchors
Hit Testing
For more current version information visit: https://docs.unity3d.com/Packages/[email protected]/manual/index.html
ARKit XR plugin:
Currently, [email protected] preview is available. Adding this plugin enables ARKit support via Unity's multi-platform XR API. Supported features are:
Efficient Background Rendering
Horizontal Planes
Depth Data
Anchors
Hit Testing
Face Tracking
Environment Probes
For more current version information visit: https://docs.unity3d.com/Packages/[email protected]/manual/index.html
Image Recognition Using AR Foundation:
Setup
To install AR Foundation, go to Window -> Package Manager, press the Advanced button, and select "Show preview packages". Then install AR Foundation version 2.2.0-preview.06. You will also need to download the ARCore XR Plugin and the ARKit XR Plugin the same way; the versions should be:
ARCore XR Plugin(For Android) : 2.2.0-preview.06 ARKit XR Plugin(For IOS) : 2.2.0-preview.06
Tumblr media
Note: make sure the SDKs you are using are updated beforehand. (The above-mentioned versions are working for me; that is why I used them.)
Player Settings
In Player Settings, add the company name and package name you want to give. Then enable Auto Graphics API. Then set the Minimum API Level to 24 and the Target API Level to the highest available.
Tumblr media
Hierarchy
Add an AR Session and an AR Session Origin by right-clicking in the Hierarchy, going to the XR tab, and adding them.
Tumblr media
Inside the AR Session Origin is the AR Camera; give it the MainCamera tag.
Inspector
Click on the AR Session Origin and add the AR Tracked Image Manager component.
Tumblr media
We now need a serialized reference image library and an image prefab.
Reference Image Library
In the Project tab, right-click and choose Create -> XR -> Reference Image Library. This is a serialized object, so we need to populate it.
Tumblr media
After adding the data, assign the library as the reference library on the AR Tracked Image Manager component.
Prefab
Create a cube in the scene, adjust it, drag it into the Project tab to make a prefab, and assign it to the AR Tracked Image Manager component.
Script
Create a script that recognizes the image, through the AR Tracked Image Manager component, and does what you want with it.
Tumblr media
Add the code written in the script above.
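In case the screenshot is hard to read, here is a minimal version of what such a script can look like, reacting to recognized images through the AR Tracked Image Manager's trackedImagesChanged event (the class name and log messages are my own):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    [RequireComponent(typeof(ARTrackedImageManager))]
    public class ImageRecognitionHandler : MonoBehaviour
    {
        ARTrackedImageManager manager;

        void Awake() { manager = GetComponent<ARTrackedImageManager>(); }
        void OnEnable() { manager.trackedImagesChanged += OnChanged; }
        void OnDisable() { manager.trackedImagesChanged -= OnChanged; }

        void OnChanged(ARTrackedImagesChangedEventArgs args)
        {
            foreach (ARTrackedImage image in args.added)
                Debug.Log("Recognized: " + image.referenceImage.name);
            // args.updated and args.removed report pose updates and lost images.
        }
    }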
Build
Press Ctrl + B to build your app; make sure you have added the scene to the build.
Once it's complete, point the camera at an image you added to the reference library, and you will see the prefab you assigned to the AR Tracked Image Manager appear on it.
Video
For the video click on: https://youtu.be/kYZiPnGhNsM
Conclusion:
Hence, image recognition is one of the features that is helpful in AR-related applications, and it can be very useful in games.
This was just a basic setup to start working with AR Foundation in Unity. There are many more features available for user-interactive gaming; image recognition is one example of an engaging gaming experience.
This tutorial can be used to work with any game or application.
0 notes
teamyesterday · 5 years ago
Text
Dev Log 4/20 - UI Changes and the Four Corners Minigame - Ian
Since the project has started I’ve accomplished three major things: 
- Restructuring and importing the Combat System that was created in a previous CS 499 project. 
- Modifications to the combat interface.
- Creation of the Four Corners minigame.
The first thing I did was package (only the essential) files, scripts, and objects that were used in the combat scene by checking the "Include dependencies" option (see screenshot below) when exporting the Unity package. I then imported the exported package and fixed any errors caused by taking what the previous combat team had done, which was in a different version of Unity, and putting it into the new working Unity project.
Tumblr media
My second task was the improvements to the user interface. For this, Dr. Byrd and Harrison requested that a "box" be added at the bottom of the screen, similar to the original Final Fantasy 7's user interface - see screenshot. The reason they wanted the box was that it was difficult to tell what was going on in the combat scene while the minigame was active. Essentially, the minigame covered the majority of the middle of the screen, making it so you couldn't see health or when the enemy was "auto attacking," which happens at set intervals.
Tumblr media
To do this I simply added a UI panel object and had it cover a desired amount of the screen - see screenshot. 
Tumblr media
After showing this to Dr. Byrd and Dr. Harrison, we agreed that it was a bit jarring. Instead, we decided that the minigame should be localized to this bottom panel, but the bottom panel itself should not be visible. This is how it currently works - see screenshot below. I also made some minor improvements by moving the health bars over the player's characters and shifting the characters up slightly, making their bodies more visible in combat.
Tumblr media
The third and most involved of the things I implemented was the Four Corners minigame. The idea is that this should be the evaluative minigame: it takes what the other minigames or parts of the game have taught the player and directly quizzes their understanding. Essentially, the player controls the square in the middle of the panel, and they have to navigate their square to the PIE word which matches the given meaning before the timer runs out.
Using the minigame structure that Vince implemented, I began creating a minigame prefab that would be instantiated whenever the minigame is to be played. I did this by creating a UI Canvas object in the MinigameTestScene scene and filling it with six image elements: one in each corner and two in the center. Each corner image represents a PIE word that the player could guess - three of which are incorrect and one of which is correct. One of the images in the center is the player, and the other is the meaning of the correct word (see the above screenshot of the UI). Each image has a text element child, which is set to the word or meaning of the word. I also added a slider object which is used to keep track of the time remaining.
In script, I used the MinigameController class to construct a controller script for my minigame, called MinigameControllerFourCorners.cs. I defined the given abstract Initialize method, which takes the correct PIE word as a parameter (the word is given when the controller is called). I took this given PIE word and set its meaning on a text element in the center of the screen. I also set the correct word as the first element in an array of words, randomWordList, and stored it in a public variable holding the correct value. The public variable is referenced by another script; the array is randomized later. I then generated three random, unique PIE words and stored them in randomWordList. I then randomized the array and set each element of the array to a text object located in each of the four corners. See screenshot.
Tumblr media
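To make that flow concrete, here is a minimal sketch of how the Initialize step could look. MinigameController (the base class from the minigame structure) and PIEWord are assumed from the project and not shown; the field names and the word-picking helper are illustrative, not the project's actual code.

using UnityEngine;
using UnityEngine.UI;

public class MinigameControllerFourCorners : MinigameController
{
    public Text meaningText;     // center element showing the meaning
    public Text[] cornerTexts;   // one text element per corner
    public string correctText;   // read by PlayerController.cs

    private string[] randomWordList = new string[4];

    public override void Initialize(PIEWord correctWord)
    {
        // Assumes PIEWord exposes 'text' and 'meaning' fields (illustrative).
        meaningText.text = correctWord.meaning;
        correctText = correctWord.text;
        randomWordList[0] = correctWord.text;

        // Fill the remaining slots with unique incorrect words.
        // PickRandomUniqueWord is an assumed helper, not shown here.
        for (int i = 1; i < randomWordList.Length; i++)
            randomWordList[i] = PickRandomUniqueWord(randomWordList);

        // Fisher-Yates shuffle so the correct answer lands in a random corner.
        for (int i = randomWordList.Length - 1; i > 0; i--)
        {
            int j = Random.Range(0, i + 1);
            string tmp = randomWordList[i];
            randomWordList[i] = randomWordList[j];
            randomWordList[j] = tmp;
        }

        for (int i = 0; i < cornerTexts.Length; i++)
            cornerTexts[i].text = randomWordList[i];
    }
}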
Similarly, I implemented a Win() and a Lose() function, which are used by the player object. And I implemented a simple timer in the Update method.
Tumblr media
Finally, I implemented a script to control the player, called PlayerController.cs. This is used by the player object to control movement and collision detection. Each of the objects located at the corners has a collider component. If the player collides with one of these objects, I use the OnCollisionEnter2D(..) Unity function to handle the collision. Essentially, PlayerController.cs references MinigameControllerFourCorners.cs and compares the value of correctText (set in the minigame controller) with the text value of the element the player collided with. If they match, the player wins; if they don't, the player fails. I also implemented simple movement using Input.GetAxisRaw and transform.Translate.
Tumblr media
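A hedged sketch of what that player script could look like, building on the controller sketch above; the move speed and the way the collided word is looked up are assumptions for illustration:

using UnityEngine;

public class PlayerController : MonoBehaviour
{
    public MinigameControllerFourCorners controller;
    public float moveSpeed = 5f; // illustrative value

    void Update()
    {
        // Simple four-directional movement, as described above.
        Vector3 input = new Vector3(Input.GetAxisRaw("Horizontal"),
                                    Input.GetAxisRaw("Vertical"), 0f);
        transform.Translate(input * moveSpeed * Time.deltaTime);
    }

    void OnCollisionEnter2D(Collision2D collision)
    {
        // Assumes each corner object has a Text child holding its PIE word.
        var label = collision.gameObject.GetComponentInChildren<UnityEngine.UI.Text>();
        if (label == null) return;

        if (label.text == controller.correctText)
            controller.Win();
        else
            controller.Lose();
    }
}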
words: 800
0 notes
siva3155 · 5 years ago
Text
300+ TOP UNITY 3D Interview Questions and Answers
UNITY 3D Interview Questions for freshers and experienced :-
1. What is Unity 3D?
Unity 3D is a powerful cross-platform and fully integrated development engine which gives out-of-the-box functionality to create games and other interactive 3D content.

2. What are the characteristics of Unity3D?
- It is a multi-platform game engine with features like 3D objects, physics, animation, scripting, lighting, etc.
- Accompanying script editor MonoDevelop (win/mac); it can also use Visual Studio (Windows)
- 3D terrain editor
- 3D object animation manager
- GUI system
- Executable exporters for many platforms: web player, Android, native application, Wii
In Unity 3D, you can assemble art and assets into scenes and environments, adding special effects, physics, animation, lighting, etc.

3. What are the important components of Unity 3D?
Some important Unity 3D components include:
- Toolbar: It features several important manipulation tools for the scene and game windows.
- Scene View: A fully rendered 3D preview of the currently open scene, which enables you to add, edit and remove GameObjects.
- Hierarchy: It displays a list of every GameObject within the current scene view.
- Project Window: In complex games, the project window searches for specific game assets as needed. It explores the assets directory for all textures, scripts, models and prefabs used within the project.
- Game View: In Unity you can view your game and at the same time make changes to it while you are playing, in real time.

4. What are Prefabs in Unity 3D?
A prefab in Unity 3D is a pre-fabricated object template (a class combining objects and scripts). At design time, a prefab can be dragged from the project window into the scene window and added to the scene's hierarchy of game objects. If desired, the object can then be edited. At run time, a script can cause a new object instance to be created at a given location or with a given set of transform properties.

5. What is the function of the Inspector in Unity 3D?
The Inspector is a context-sensitive panel where you can adjust the position, scale and rotation of GameObjects listed in the Hierarchy panel.

6. What's the best game of all time and why?
The most important thing here is to answer relatively quickly, and back it up. One of the fallouts of this question is age. Answering "Robotron!" to a 20-something interviewer might lead to a feeling of disconnect. But sometimes that can be good. It means you have to really explain why it's the best game of all time. Can you verbally and accurately describe a game to another person who has never played it? You'll rack up some communication points if you can. What you shouldn't say is whatever the latest hot game is, or blatantly pick one that the company made (unless it's true and your enthusiasm is bubbling over). Be honest. Don't be too eccentric and niche, and be ready to defend your decision.

7. Do you have any questions regarding us?
Yes. Yes, you do have questions. Some of your questions will have been answered in the normal give-and-take of conversation, but you should always be asked if you have others (and if not, something's wrong). Having questions means you're interested. Some questions are best directed to HR, while others should be asked of managers and future co-workers. Ask questions that show an interest in the position and the long-term plans of the company. For some ideas, see "Questions You Should Ask in an Interview," below.

8. What are the characteristics of Unity3D?
(This repeats question 2 - see the list above.)

9. List out some best practices for Unity 3D
- Cache component references: always cache references to components you need to use in your scripts.
- Memory allocation: instead of instantiating new objects on the fly, consider creating and using object pools. This reduces memory fragmentation and makes the garbage collector work less.
- Layers and collision matrix: for each new layer, a new column and row are added to the collision matrix. This matrix is responsible for defining interactions between layers.
- Raycasts: a raycast fires a ray in a certain direction with a certain length and lets you know if it hit something.
- Physics 2D/3D: choose the physics engine that suits your game.
- Rigidbody: an essential component when adding physical interactions between objects.
- Fixed Timestep: the fixed timestep value directly impacts the FixedUpdate() and physics update rate.

10. What do you do on your own time to extend your skills?
As a programmer, do you work on home projects? As a designer, do you doodle design ideas or make puzzles? As an artist, do you do portrait work? Having hired many people in the past, one of the things I can speak to with authority is that those people who spend their off time working on discipline-related projects are the ones who are always up on current trends, have new ideas, are most willing to try something new, and will be the ones taking stuff home to tinker with on their own time. Now that shouldn't be expected of everyone, but the sad reality is that there is competition for jobs out there, and those who are prepared to put in the extra work are the ones that are going to be in hot demand. Demonstrating that you learned C# over a weekend because you thought it was cool for prototyping is exactly the kind of thing a programming manager wants to hear. Suddenly your toolset expanded, and not only did it show willingness to do something without being told, it makes you more valuable. The only care here is to not mention an outside situation that might detract from or compete with your day job.
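Picking up the "cache component references" practice from question 9, a minimal hedged sketch (the class and field names are illustrative):

using UnityEngine;

public class CachedMover : MonoBehaviour
{
    Rigidbody rb; // cached once instead of calling GetComponent every frame

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // Uses the cached reference; no per-frame GetComponent lookup.
        rb.AddForce(Vector3.forward);
    }
}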
Tumblr media
UNITY 3D Interview Questions

11. How do you feel about crunching?
At smaller studios, this is the 64 million dollar question. My advice is to be 100 percent honest. If you won't crunch, say so now. It may well put you out of the running for a job, but ultimately that's a good thing. No, really, it is! If the company works a lot of overtime and you don't want to do it, then taking the job is going to be punishing for everyone. Having said that, the last thing any interviewer wants to hear is, "I won't do it" because that predicates a perceived lack of involvement and passion (not that passion should equal overtime, but the perception of refusing to do something before you're even in the circumstances could be the difference between getting a job offer and having the company pass you up). Phrase your answer in such a way that you don't sound confrontational with the interviewer. She doesn't want to get into an argument; she just wants to know where you stand. Understand that this question is meant to gauge, roughly, how you might fit into the company culture.

12. How would you make the games you're playing better?
You'd be surprised how often this question comes up, even if you aren't interviewing for a design position. Everyone wants a developer who has design sensibilities because it inevitably means she or he will be more involved and engaged in whatever is going on. Knowing ahead of time how you might answer this question means you'll come off sounding like you've actually thought about a game in development terms. Game studios are looking for people who think as they play -- about what they're playing, how it's done, what could have been improved, and most importantly, what they can rip off. One downside to adopting this mentality is that it becomes harder to enjoy a game for what it is, but that's an occupational hazard in all jobs. Believe it or not, you can answer this question in an entirely positive way. However, if you decide instead to criticize a design or implementation decision in a game, be sure you have a solution to the problem too. It's not enough to moan about the final strider battle in Half-Life 2: Episode 2; you have to have an idea of how it could have been made more enjoyable, perhaps through easier car control, or not destroying all the supply stations so quickly. If you decide to bash a game that the company where you're interviewing developed (and that takes courage; some companies will applaud you while others will diss you for not drinking the Kool-Aid), then ensure that what you're criticizing isn't something subjective but something that everyone has had a pop at. Be ready to back up the criticism with proof that it's an agreed-upon flaw, not just you being nit-picky.

13. Explain what a vertex shader is, and what a pixel shader is.
A vertex shader is a script that runs for each vertex of the mesh, allowing the developer to apply transformation matrices and other operations, in order to control where the vertex is in 3D space and how it will be projected on the screen.
A pixel shader is a script that runs for each fragment (a pixel candidate to be rendered) after three vertexes are processed in a mesh's triangle. The developer can use information like the UV/texture coordinates and sample textures in order to control the final color that will be rendered on screen.

14. Where do you want to be in five years?
Personally, I love this question because it reveals if a prospective candidate has a plan at all or is just drifting from job to job as so many are wont to do. There's nothing wrong per se with people who drift along the currents, it's just that those with a plan (or at least a desire to move in a particular direction) are generally much more interesting people. Plus, they are almost always inherently more predictable, which is always a benefit for employers. Having a desire to move forward helps everyone. It helps you measure your progress, and it gives the company a plan to help you get there. Of course, it does depend on you knowing what you want. Most people tend to know what they don't want, but not necessarily what they do want, which is a problem -- particularly if you express that in an interview. Interviewers would rather have a list of things you want to attain rather than things you don't. One optimal answer is, "Still working for you making games," but it smacks of sucking up, so I'd recommend saying something a little more generic: "Still looking for a challenge and putting in that extra effort to make great games." The best response I've ever heard to that question was, "I want your job!" and the individual who said it to me indeed has my old job! But be wary of sounding confrontational.

15. Why should vectors be normalized when used to move an object?
Normalization makes the vector unit length. It means, for instance, that if you want to move with speed 20.0, multiplying speed * vector will result in a precise 20.0 units per step. If the vector had a random length, the step would be different than 20.0 units.
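A small sketch of question 15's point - normalizing before scaling by speed keeps the step length exact (the names and speed value are illustrative):

using UnityEngine;

public class ConstantSpeedMover : MonoBehaviour
{
    public Transform target;
    public float speed = 20f;

    void Update()
    {
        Vector3 dir = target.position - transform.position;
        dir.Normalize(); // unit length, so we move exactly 'speed' units per second
        transform.position += dir * speed * Time.deltaTime;
    }
}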
16. Why do you want to work here as a Unity3D developer?
(This question implicitly includes, "Why do you want to leave where you are?" if you're currently employed.) This question is an open opportunity to show you've done some research on the company where you're interviewing. All companies and interviewers are flattered when the interviewee knows who they are, knows what games they make, and wants to be a part of their experience. Do your homework and put on a good show! Don't say things like, "I need a job," or "I need to move to Sacramento." Instead, pick a few things that are germane to the company in question. The more specific your reasons are tied to the company, the better. "I want to work on FPS shooters" isn't as good an answer as "I want to work on Game Franchise X because I played the first two games and still see potential for future growth of the product." It's sycophantic, yes, but interviewers are as prone to flattery as anyone else -- although don't give that as your only reason. When explaining why you want to leave your current job, the trick is to not be negative. Pick a couple of points that are inarguable, for example, "There was no career development" or "They weren't working on the kinds of games I'm interested in," rather than "Their management is clueless and they are going to die soon." The game industry is a small community -- you could very well be talking smack about your interviewer's close buddy. If you were let go or fired, it's better to say something like, "We decided to part ways," or "It was my time to leave," rather than go into too much detail, unless directly pressed. In that case, the interviewer probably already knows what went down and is just looking to see what you'll say. Answer the question quickly and without negativity, and move on. You want to leave a positive impression.

17. Why does deferred lighting optimize scenes with a lot of lights and elements?
During rendering, each pixel is calculated for whether it should be illuminated and receive light influence, and this is repeated for each light. After approximately eight repeated calculations for different lights in the scene, the overhead becomes significant. For large scenes, the number of pixels rendered is usually bigger than the number of pixels on the screen itself. Deferred lighting renders all pixels of the scene without illumination (which is fast) and, with extra information (at the cost of a low overhead), calculates the illumination step only for the pixels of the screen buffer (which is fewer than all the pixels processed per element). This technique allows many more light instances in the project.

18. Can two GameObjects, each with only a SphereCollider, both set as trigger, raise OnTrigger events? Explain your answer.
No. Collision events between two objects can only be raised when one of them has a RigidBody attached to it. This is a common error when implementing applications that use physics.
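A minimal setup illustrating question 18's rule - trigger callbacks only fire when at least one participant also carries a Rigidbody (a kinematic one is fine); the component name and log text are illustrative:

using UnityEngine;

// Attach to an object whose collider has "Is Trigger" checked.
// Either this object or the one entering it must have a Rigidbody,
// otherwise OnTriggerEnter will never be called.
public class TriggerLogger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        Debug.Log(other.name + " entered the trigger");
    }
}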
19. What is a Unity3D file and how can you open one?
Unity3D files are scene web player files created by Unity, an application used to develop 3D games. These files consist of all assets and other game data in a single archive, and are used to enable gameplay within a browser that has the Unity Web Player plugin. The assets within a Unity3D file are saved in a proprietary closed format.

20. What's your biggest weakness? Or, if I hired you, what would I regret about it in six months?
This is a common question in all job interviews. There are generally two kinds of responses: the brutally honest and damning one ("I get upset with people who don't carry their load"), and the sycophantic one ("I'm a perfectionist"). What most employers are looking for is an honest answer that is followed up with an example of something you have done to work on your weakness. For example, you can say, "My workspace tends to become extremely disorganized," as long as you follow it up with, "but recently, I've put in a lot of effort to go paperless, and I'm extremely systematic in the way I manage my email inbox." The other secret to this question is not so much in the answer but how long you take to respond. If you answer too quickly, you might be suggesting that you already know all your worst points because they are blatantly obvious and you've been told so many times. If you take too long, it will seem as if you're searching for an answer that sounds good, doesn't make you look bad, and is something the interviewer would be happy to hear. Again, it gives the perception that you are being ingratiating rather than honest. By the way, the best answer I've heard is, "I don't know. What do you think I'd regret in six months if I worked here?"

21. What is Fixed Timestep in Unity3D? Why does the Fixed Timestep setting affect game speed?
The Fixed Timestep feature sets the system updates at a fixed time interval. A queue-like mechanism manages all real-time events that accumulate between time epochs. If the frame rate drops below the threshold set for the fixed timestep, it can affect the game speed.

22. Explain, in a few words, what roles the Inspector, Project and Hierarchy panels in the Unity editor have. Which is responsible for referencing the content that will be included in the build process?
The Inspector panel allows users to modify numeric values (such as position, rotation and scale), drag and drop references of scene objects (like prefabs, materials and game objects), and more. It can also show a custom-made UI, created by the user, using editor scripts.
The Project panel contains files from the file system of the assets folder in the project's root folder. It shows all the scripts, textures, materials and shaders available for use in the project.
The Hierarchy panel shows the current scene structure, with its GameObjects and their children. It also helps users organize them by name and by order relative to a GameObject's siblings. Order-dependent features, such as UI, make use of this categorization.
The panel responsible for referencing content in the build process is the Hierarchy panel. It contains references to the objects that exist, or will exist, when the application is executed. When building the project, Unity searches for them in the Project panel and adds them to the bundle.

23. Why should Time.deltaTime be used to make things that depend on time operate correctly?
Real-time applications, such as games, have a variable FPS. They sometimes run at 60 FPS, or, when suffering slowdowns, at 40 FPS or less. If you want to change a value from A to B in 1.0 second, you can't simply increase A by B-A between two frames, because frames can run fast or slow, so each frame has a different duration. The way to correct this is to measure the time taken from frame X to X+1 and increment A proportionally to that frame duration, by doing A += (B-A) * Time.deltaTime. When the accumulated deltaTime reaches 1.0 second, A will have assumed the value of B.

24. Which of the following examples will run faster? 1000 GameObjects, each with a MonoBehaviour implementing the Update callback, or one GameObject with one MonoBehaviour holding an array of 1000 classes, each implementing a custom Update() callback?
The correct answer is the second. The Update callback is called using C# reflection, which is significantly slower than calling a function directly. In our example, 1000 GameObjects, each with a MonoBehaviour, means 1000 reflection calls per frame. Creating one MonoBehaviour with one Update, and using this single callback to update a given number of elements, is a lot faster, due to the direct access to the method.

25. Arrange the event functions listed below in the order in which they will be invoked when an application is closed: Update(), OnGUI(), Awake(), OnDisable(), Start(), LateUpdate(), OnEnable(), OnApplicationQuit(), OnDestroy().
The correct execution order of these event functions when an application closes is as follows:
Awake()
OnEnable()
Start()
Update()
LateUpdate()
OnGUI()
OnApplicationQuit()
OnDisable()
OnDestroy()
Note: You might be tempted to disagree with the placement of OnApplicationQuit() in the above list, but it is correct, which can be verified by logging the order in which the calls occur when your application closes.

26. Okay, we're going to work through a problem here.
Often in game job interviews, you will be presented with a problem to solve, or even a full-blown test, depending on the position. It might be grease board work, it might be a conversation, it might be a level design test, it might even be a code test at a PC. The premise is that the interviewer wants to see how you work. Often, once you've answered the question, the interviewer will change the parameters to see what you'll do.
But what do you do if you have no clue what's being asked, or if it's outside your area of expertise? That's a panic moment if there ever was one. Take a deep breath and realize that this is a moment where you need to say, "I'm not sure I understand the question," or "That's not something I've done before." But immediately after that, start asking questions about the problem and take a stab at solving it. That's one of the biggest things you can do at this point -- admit ignorance then have a go anyway. Showing a willingness to try something outside your field of knowledge is huge to interviewers. It shows you want to learn and be more than what you are now. Sometimes, the fact that you tried is more important than the actual result, and sometimes, you'll have an interviewer who will give you hints toward a solution just because you showed that willingness to try. The more junior you are the more likely this is to happen. Occasionally, interviewers will deliberately put you out of your comfort zone just to see how you'll react, so be aware!

27. Consider the following code snippet below:

class Mover : MonoBehaviour
{
    Vector3 target;
    float speed;

    void Update()
    {
    }
}

Finish this code so the GameObject containing this script moves with constant speed towards target, and stops moving once it is within 1.0 unit of distance.

Answer:

class Mover : MonoBehaviour
{
    Vector3 target;
    float speed;

    void Update()
    {
        float distance = Vector3.Distance(target, transform.position);
        // will only move while the distance is bigger than 1.0 units
        if (distance > 1.0f)
        {
            Vector3 dir = target - transform.position;
            dir.Normalize(); // normalization is obligatory
            transform.position += dir * speed * Time.deltaTime; // using deltaTime and speed is obligatory
        }
    }
}

28. Can threads be used to modify a Texture at runtime? Can threads be used to move a GameObject on the scene? Consider the snippet below:

class RandomGenerator : MonoBehaviour
{
    public float[] randomList;

    void Start()
    {
        randomList = new float[1000]; // illustrative size
    }

    void Generate()
    {
        System.Random rnd = new System.Random();
        for (int i = 0; i < randomList.Length; i++)
            randomList[i] = (float)rnd.NextDouble();
    }
}
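Question 28 turns on a rule worth stating plainly: Unity's scene API (Textures, Transforms, GameObjects) may only be called from the main thread, while plain C# work such as filling the float array can run on a background thread. A rough sketch of that split, with illustrative names and sizes:

using UnityEngine;
using System.Threading;

public class ThreadedRandomGenerator : MonoBehaviour
{
    public float[] randomList;
    volatile bool done; // set by the worker, read on the main thread

    void Start()
    {
        randomList = new float[1000000]; // illustrative size
        // Pure C# computation is safe off the main thread...
        new Thread(() =>
        {
            var rnd = new System.Random();
            for (int i = 0; i < randomList.Length; i++)
                randomList[i] = (float)rnd.NextDouble();
            done = true;
        }).Start();
    }

    void Update()
    {
        // ...but anything touching the Unity API must happen here.
        if (done)
        {
            done = false;
            transform.position = new Vector3(randomList[0], 0f, 0f);
        }
    }
}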
- What's your favorite book? Movie? TV show?
- Do you prefer open worlds or well-defined quest lines? Do you think a game should/can have both?
- What's your favorite character class?
- How would you briefly describe the mechanics of your favorite game to a non-programmer?
- Do you usually play games to the end?
- What's your Beta test experience? (No, you're not looking for a QA person, BUT it doesn't hurt to hire a programmer who thinks like a QA person at least a little, as in being able to vet their own work before they hand off a fix as "done.")
- What's your favorite game of ours and why? (If you've only published one game, they better have played it! And listen for their own words - if they sound like they're parroting what they read about your game, it's entirely possible they haven't actually played it.)
- If you could work in any other area of our industry, what would it be and why?
- What makes a game fun for you?

31. List out the pros and cons of Unity 3D.
Pros:
- It uses JavaScript and C# for scripting.
- Unity provides an Asset Store where you can buy or find stuff that you want to use in your games.
- You can customize your own shaders and change the way Unity renders the game.
- It is a great platform for making games for mobile devices like iOS, Android and the web (HTML5).
Cons:
- Compared to Unreal Engine it has lower graphics quality.
- The interface is not user-friendly and it is hard to learn, especially for beginners.
- It requires good programming knowledge, as most of the stuff runs on scripts.

32. What will you bring to the team? Why do we need you?
This is a general question that applies to all interviews. There are two ways to answer: the big answer and the little answer. The big answer requires you to have some knowledge of how the company operates. Who does what? Your goal is to slot your experience, passion and skills (and if you are a student, your passion, skills, and desired career direction) into any holes the company may have -- and it should have some. Otherwise, why are they hiring? The little answer is to name some of your previous experiences and best qualities and hope that's enough. Care needs to be taken that a) you don't sound arrogant in assuming the company will die without you and b) you don't say negative things about the company. Statements like, "Well, you obviously can't do good Q/A. You need a good Q/A manager," are likely to go down like a lead balloon. Frame your answer to suggest that you would bring extra expertise, and therefore improvement, to something that's already in place.

33. What game would you make if money were no object?
Everyone has a pet project they would want to make if they had the chance -- it's just inherent in the game developer psyche. This is your chance to expound on it, and the more realized your idea is, the more it will be seen as proof that you know what you're doing. Taking an existing idea and adding, "but I'd make it cooler!" isn't the answer (the number of times I've heard Q/A staff wanting to become developers tell me they want to remake Counter Strike "but better" is staggering); it just shows you have enthusiasm, but no original ideas. Bonus points if you can take an existing IP license and make a compelling argument for a game out of it. People who can actually do that are at a premium in our industry since most tie-ins, well, suck.

34. What games are you playing?
If you plan to work for a video game company, you'd better be playing games -- and you'd better be able to demonstrate that. It's good form to mention some games that are in the same genre as the games made at that company. It's even better if you mention playing some of the games that were actually made there. Again though, don't go over the top. At the very least, play the demo of anything they've produced. You need to be knowledgeable about the genre, what you enjoy about it, and how the development of these games is affected by the genre (as much as you can be). So research the company before the interview. How you answer this question can be a deal breaker or a deal maker for hiring managers. They want to hire people who are demonstrably passionate about the games their company makes. Saying, "I have a level 70 mage in World of Warcraft and a level 40 druid in EverQuest," to Blizzard makes the point that you are immersed in its product genre. Demonstrating some knowledge about older games also shows you're grounded in game history, which is never a bad thing. The wider your knowledge base, the more you can forestall going down blind alleys in terms of implementation and design, which benefits everyone, and that's exactly what a company is looking for in its employees.

35. List out some key features of Unity3D vs UE4 (Unreal Engine 4).
UE4:
- Game logic is written in C++ or the Blueprint editor.
- Base scene object: Actor.
- Input events: the UInputComponent component of the Actor class.
- Main classes and functions include int32, int24, FString, FTransform, FQuat, FRotator, Actor and TArray.
- To create a new instance of a specified class and point to the newly created Actor, UWorld::SpawnActor() may be used.
- The UI of Unreal Engine 4 is more flexible and less prone to crashes.
- It does not support systems like Xbox 360 or PS3; it requires an AMD Radeon HD card to function properly.
- Less expensive compared to Unity3D.
- To use UE4 you don't need programming language knowledge.
Unity3D:
- Game logic is written using the Mono environment.
- Base scene object: GameObject.
- Input events: the Input class.
- Main classes and functions include int, string, Quaternion, Transform, Rotation, GameObject and Array.
- To make a copy of an object you can use the Instantiate() function.
- The asset store of this tool is much better stocked than UE4's.
- It supports a wide range of gaming consoles like Xbox and PS4, as well as their predecessors.
- Unity3D has a free version which lacks a few features, while the pro version is a bit expensive compared to UE4.
- It requires programming language knowledge.

36. What is the use of AssetBundles in Unity3D?
AssetBundles are files that can be exported from Unity to contain assets of your choice. AssetBundles are created to simplify downloading content to your application.

37. In Unity 3D, how can you hide a GameObject?
To hide a GameObject in Unity 3D, use the code gameObject.SetActive(false);

38. Questions you should ask in a Unity3D interview:
- What are the core working hours?
- How do you assign or schedule tasks? Who gets to decide who does what and estimates time?
- What's the career path for this job? How do I get to progress? What is the process for promotion?
- What training approach do you use? How would I learn new skills?
- How are personnel reviews handled? Who does them and how often?
- Are there any specific development processes used here, for example, Scrum?
- Who would I report to?
- If I'm hired, what is the next game I might work on? How much input would I have on that?
- Is there a relocation package?
- What bonus structure or incentives are there?
0 notes
urukyra · 6 years ago
Text
Min’atoa Station Post Mortem
Min’atoa Station, my 6-month capstone project for my Game Development course at Yoobee Colleges, in which I fall down a rabbit hole, drown in a pool of tears and learn to make magic.  Or, less poetically, scope too big, lose and remake multiple assets multiple times, and launch a game that falls well short of my goal - yet shows a glimpse of a potentially amazing experience. 
My aim was a linear 3D narrative game - think Gone Home in a Myst-type setting with a terrorist theme. I reckon I got halfway, so there's only 90% left to go.
Tumblr media
Team vs Solo 
Tutors urged us to push our boundaries, and in my Goodest Boi team I stretched my wings into new areas and thrived. In other teams (whether real or not) I’d felt held back by low expectations. My mantra was ‘play big’. I love to learn, and that means embracing looking stupid, stumbling before you can walk. I chose a solo project, so I’d be propelled to shine, and I was pleased I did. The gasps of surprise from the class even at my prototype were validating. A teacher once said more learners rust out than burn out - having a tutor that believed in me created its own empowerment magic.
New Idea vs Darling 
I was torn between:
A new game, designed for addictive game-play loops, replayability, marketing hooks, commercial 
Min’atoa - unknown market, unproven gameplay, not replayable, huge scope, high risk. 
The gamedev mantra "kill your darlings" echoed in my head. I brainstormed great alternatives that I loved. And yet, YOLO, carpe diem. I left a 'safe' life doing what I was told for this. The window was open - now or never. I'd never "finish" Min'atoa left to my own devices. It needed a structure for existence - the force-field of deadlines, accountability and expert help. I knew it was too big, so I 'maimed' my darling: reduced scope, existing assets, basic textures, no puzzles - just a story game. That seemed do-able (cue evil laughter). So I talked myself into 'story-only Min'atoa.' Call me crazy, but I don't regret it.
Tumblr media
Tools
I got overwhelmed comparing narrative tools: Inkle, Twine, Yarnspinner, Ren'Py, Ink, Prairie, Fungus, novel software, Scrivener. My author friend Peter cut through my angst by wryly observing that Shakespeare used a quill pen. For my Myst-type story - linear, non-branching, no dialogue - Google Docs was fine!
Writers Block
Although I had a plot, I couldn't start. Tutor Matt P got me to put story beats in linear time order, then rearrange them into the order the story needed, writing short 'memory joggers' of each plot movement onto Post-It notes. This simple process broke the writer's block.
Tumblr media
Later, I found myself blocked again, and dedicated an entire week turning these brief notes into strings of story.
Tumblr media
Story gating
Player autonomy is a key feature of game design. In game writing, each story element has to stand on its own in whatever order players come across it, and the plot still has to make sense. So story games build artificial 'gates' to order key story elements (eg in locked rooms), to achieve greater dramatic tension and plot cohesion.
I fit the plot into the game's natural gates: portal, balcony, room, controller balcony, and Arrivals (and later, Departures) desk drawers. Doors opened with buttons and crystal docks. I felt clever making drawer locks, and hiding keys and crystals. The gates were not infallible, but 'good enough' in playtesting.
Tumblr media
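As a rough illustration of such a gate - a lock that only opens for the right key or crystal - here is a minimal sketch; the item name, carried-items list and animator trigger are assumptions for illustration, not the game's actual code:

using System.Collections.Generic;
using UnityEngine;

// A drawer or door lock that opens only when the player carries the right item.
public class StoryGateLock : MonoBehaviour
{
    public string requiredItem = "BrassKey"; // illustrative: a key or crystal id
    public Animator doorAnimator;            // plays the open animation

    public void TryOpen(List<string> carriedItems)
    {
        if (carriedItems.Contains(requiredItem))
            doorAnimator.SetTrigger("Open"); // the story beyond is now reachable
    }
}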
The downside was, the story game became a puzzle game. It changed how players played, from a slow pace that encouraged reading to an active one, testing interactable objects to see what they did. In retrospect, I wish I'd deliberately designed for the slower game feel of Gone Home, where players interact with passive objects whose function is to add atmosphere.
Story Element Workflow
There were 25 notices and 21 letters, each with two gameObjects - players click on a 3D object in the scene to bring up UI with its matching 2D readable. This meant 92 assets whose materials change whenever I edit their words. A simple workflow was essential.
Playtesters noted the 25 notices were easily legible; that removed 25 UI assets. 
Tumblr media
For the 21 letters, I felt clever about my idea of stationery. I made stationery (paper, design and font) for each character (Tris, S’tiel, Priestess and Council). Each in-game letter automatically populated its UI stationery with a text string. Instead of 21 UI assets, I only needed four.
Tumblr media
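A sketch of how that hookup might look - one shared stationery panel per character, populated with each letter's string on click; all the names here are illustrative rather than the project's actual code:

using UnityEngine;
using UnityEngine.UI;

// One of these sits on each in-game letter object.
public class Letter : MonoBehaviour
{
    [TextArea] public string body;     // the letter's unique text string
    public StationeryPanel stationery; // shared panel for this character

    void OnMouseDown() // clicking the 3D letter opens the UI
    {
        stationery.Show(body);
    }
}

public class StationeryPanel : MonoBehaviour
{
    public Text bodyText; // text element on the stationery art

    public void Show(string body)
    {
        bodyText.text = body; // populate the shared stationery
        gameObject.SetActive(true);
    }
}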
I was smug about this at first - I'd polish text, and the UI automatically updated. But for beta, for the first time I had to have textures on all 46 3D documents. It was an awful workflow. I'd play through, click on each letter or notice, bring up the UI stationery with its unique string, snip it, create 46 materials from the snips, and tile these to fit each gameObject.
It was so tedious to change materials that it created a mental barrier to improving the text, even if it was way too long, or made me cringe. I deleted eight noticeboard posters that were too embarrassing. I left "Lorem Ipsum" text on most letters. I wish I'd fixed these. I did find ways to automate this process, but events overtook me (see the refactoring section) so I didn't get time to code it.
Tumblr media
Localisation
The more words, the more difficult and expensive localisation becomes, reducing the potential market. If I were doing it again, I would aim to:
reduce word count dramatically, and use more images
have illegible 3D textures (see for example Zelda, Breath of the Wild) or develop an alien text / symbols
retain the process of populating 2D UI assets with strings, so that it would be easy to swap in strings for different languages (a rough sketch follows this list).
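A minimal sketch of that string-driven approach - the same UI asset populated from a per-language table. The key, the languages and the sample strings here are all illustrative:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class LocalizedText : MonoBehaviour
{
    public string key = "letter_tris_01"; // illustrative string id
    public Text target;

    // In a real project this table would be loaded from data files.
    static readonly Dictionary<string, Dictionary<string, string>> tables =
        new Dictionary<string, Dictionary<string, string>>
        {
            { "en", new Dictionary<string, string> { { "letter_tris_01", "Dear S'tiel..." } } },
            { "fr", new Dictionary<string, string> { { "letter_tris_01", "Cher S'tiel..." } } },
        };

    public void Apply(string language)
    {
        target.text = tables[language][key]; // same 2D UI, different language
    }
}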
Prioritise Your Intuition
Sam Fleury of Runaway Play gave an NZGDC talk on prioritising your intuition to reduce burnout and improve personal effectiveness. He noted that our tendency to try to 'push through' a wall is often driven by feeling unworthy, and it often leads to bad code (or other work) that has to be redone.
When my brain clearly saw the chains of logic, I coded quickly and cleanly. When my 'programming mind' lost its edge and went 'fuzzy', continuing to push produced bad code and left me feeling burnt out. I found it took an active decision to resist the 'imposter syndrome' urge to push on. I'd step back, take a rest, or pivot to a task that used another part of the brain (art or story).
I took note of how sunshine, various foods, coffee and rest affected my focus, and set up the right environment. Dancing barefoot on the grass is great therapy.
Tumblr media
More often than not, when I came back I’d see a bigger picture, and pivot to a different priority, or to a fresh, cleaner approach. I’d never pivot when I was nose to the grindstone. 
I heartily recommend this practice. It was vital for a solo dev on a big-scope game. Pacing myself was not costly: I wrote better code in less time with less stress. And it might seem obvious, but burnt-out, tired devs don't make games that are fun, intriguing, and delightful.
Attributions  
I wanted assets that left the option open to allow commercial use. This hugely limited choice for the game's many imported assets: ~20 sound effects, music, five fonts, five paper textures and plugins.
I recommend designing a good filing system for attributions. I did record them as I went, but not in one place. Finding them months later cost time I’d rather spend on my game. 
Brian and I made most of the images and icons from scratch. But right at the end I realised an image used fan art I'd made for #Myst25 from Riven, a game by Cyan Worlds. Commercial use violated their very generous terms for fan art.
Tumblr media
I wrote to Cyan, saying I’d remove it, but it’d be a lovely Easter egg for Myst fans. Hannah Gamiel, Director of Development, Cyan Inc immediately wrote back to give permission to use it, which was typical of the lovely Cyan approach to their fans. 
Tumblr media
Refactoring
We'd planned for Brian being away for three months on his OE, but didn't factor in the month it took to reinstate his melted server, or his hospital stay with pneumonia. Since Yoobee had only Unity 2018, I'd coded the prototype, Sabotage, from scratch in three weeks.
Tumblr media
Once Yoobee got Unity 2019, I reverted to Min'atoa with Brian's code, which was robust and elegantly effective. But since we were both new to Unity it used unusual approaches - event signals, listeners via parented assets, and master controllers with enums. After painfully watching me struggle, my tutor Woody spelt out the stark choice: strip out Brian's code and he'd help me rebuild it, or struggle on alone.
I chose to strip it out. I really wanted to step up at coding, and Woody was brilliant at it. Although Min’atoa would not be finished to the level I wanted in other areas, I could do the writing and art later.  
Deleting two years of scripts left 416 fatal errors; removing 'missing scripts' from assets took hours. It would be an enormous task to rebuild. I brutally trimmed my asset list of Brian's features (a fully functional inventory, and putting items down) and features I'd planned (writing, art, animations, codes and puzzles).
Then I got intensive tutoring from Woody. I learned: 
keep it simple - add complexity only as required 
use prefabs - get one asset completely right, then 20 others work 
get the essentials (story gates) working - doors, drawers, lifts and locks. .. 
Within a month, most of the functions worked again. My crowning achievement was replacing Brian's inventory with a scroll-selectable list that appears on hover (over a lock that takes multiple items) and shows which carried items fit.
Tumblr media
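A stripped-down sketch of that hover list - filtering the carried items down to the ones a lock accepts. The string-based inventory and all the names are illustrative, not the game's actual code:

using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class MultiItemLock : MonoBehaviour
{
    public List<string> acceptedItems; // what this lock takes
    public List<string> carriedItems;  // the player's inventory

    // Called on hover to build the scroll-selectable list.
    public List<string> GetUsableItems()
    {
        return carriedItems.Where(item => acceptedItems.Contains(item)).ToList();
    }

    void OnMouseEnter() // requires a collider on the lock object
    {
        List<string> usable = GetUsableItems();
        Debug.Log("Usable here: " + string.Join(", ", usable));
        // A real version would populate the hover UI instead of logging.
    }
}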
I was also pleased to get different endings working (including one that pivots the whole scene).  I’ll never know what Min’atoa would be if I’d made the other decision, but I do know I would not have learned as much as I did.
The biggest code drawback was having no inventory for the 21 letters; they're just lost. The player can't refer back to any letter they'd collected. Woody had shown me how to do it, but I spent the remaining time fixing bugs and improving art and gameplay. This is such a major drawback that if I get time, I'd like to issue a patch for it, and I (think) I now know enough to do it.
Result 
I launched a game that I have mixed feelings about. On one hand, I have another game in my public itch.io portfolio. Overall, for pretty much a solo game made in a limited time, it showcases my capabilities and adds to my credibility as a  game developer. 
On the other hand, I wish it were more fun, that the story was better, the puzzles more difficult, the game design was more complete. The main failing with the game was the story. I have so much to learn and I plan to fully engage with expert writing mentors next year to learn to
create empathy and connection with the main characters 
reveal through what’s not said, rather than tell
reduce word count (strict 140 character limit per item) 
use environmental storytelling.
Given the need to limit scope, I only included very basic puzzles that were not at the level of complexity or engagement of good competitors, like Aporia or Eastshade. Brian and I had designed more complex puzzles, but specifically removed those for scope reasons. If I were to do it again, I would prioritise adding to the puzzle component in simple ways, such as:
embed hidden clues, codes and hints
add images, sketches and drawings . 
Not having clues to choose different endings is a major omission. For much of the development, I placed puzzle items to make life easy for me, rather than for the player's satisfaction. The player finds them in obvious places, one after the other, repetitively, instead of having to use deductive reasoning.
Tumblr media
I would put more time into thinking about how to make it fun for the player to discover hidden items, work out lore-logical places for them, and hint at their location rather than make it so obvious. 
Conclusion
There are definitely things I want to improve in Min’atoa Station, but for now, the game is out ‘as is’. 
Next time I’d invest time early to “find the fun”. Find the fun in the story, puzzles, and gameplay from the player’s perspective as early as possible and build from a solid base of a proven enjoyable gameplay experience. 
At the end, my measure is not even the game, but what I have learned. I'm a person with new skills. Looking back, my progress seems humbling and miraculous. In 2017, I first clumsily opened Photoshop, my first ever digital tool. In 2019, I made Min'atoa Station, a credible 3D game. Without diminishing Brian's enormous contribution, or that of my tutors, it was 'my game'. I designed the world, characters, story and gameplay; modelled and textured 3D assets; made the 2D assets, menus, animations, lighting and audio; did the voice acting; used many plugins and more. I ended up coding everything. I listened and learnt, I asked for help and got lots, I struggled and fought and... Lo.
As I reflect on the end of my time at Yoobee, my journey as a game developer has been, and I hope will continue to be, intense and exhilarating. To me, it’s been an incredible privilege to learn to make worlds from my imagination come to life. 
0 notes
imohsenreshadati · 6 years ago
Text
Sci Fi Top Down Game Template - a ready-made shooter game project for Unity
Tumblr media
Tumblr media
This package includes:
- Soldier character model, with customizable colors by shader
- 2 weapon models (pistol & rifle)
- Lots of animations
- Complete character controller and inventory script, using mouse & keyboard or joystick to control the player
- Fully customizable weapons component, where you can change everything from fire rate to accuracy (divergence), with selectable bullet prefabs, velocity, how many bullets per shot, and more
- Four premade weapons (pistol, rifle, rocket launcher and shotgun ;) )
- Player HUD: health, stamina, weapon stats (weapon icon, bullet and clip counters)
- Objective radar included on the player (direction and distance)
- Standard camera rig following the player smoothly, with parametric camera shakes for shooting and explosions
- Pick-up actor for items (weapons, keys, health or ammo)
- Automatic doors, lockable and customizable (single and double)
- Interactable panels to use in different situations, like opening doors or activating anything in the scene
- Basic enemy AI using waypoints and Unity NavMeshes
- Enemy spawner component, with basic parameters like spawn rate, max spawns, spawn area, and more
- Waypoint route component to create enemy waypoint routes easily, assigning transforms to an array and visualizing everything you need in the editor

Updates:

V 1.41 features:
- New Survival mode
- Friendly AI (simple follow and stand in place)
- Quick reload for weapons (timed event)
- Multiple bug fixes and update to 2017.2
- Easy character prefab creation
- Arm IK: weapon grip upgraded

V 1.3 features:
- Local multiplayer
- Optional split screen
- New blood VFX splat
- Ragdoll death
- Beam weapons

V 1.2 features:
- Added automatic turrets
- New tower defense AI
- New tower defense demo scene
- Team check in bullets (friendly fire)

V 1.1 features:
- Added melee weapons for player and enemies
- Enemy & player creator tools; easy to create a player or an enemy using another prefab as a reference
- Simple aiming weapon IK to point the weapon exactly at the target
- Always-aiming switch, to use in twin-stick shooters where you want to be able to shoot at any time
- Root motion and no root motion working better
- Weapon list now editable in a scriptable object
- Weapons can now shoot multiple bullets in a perfect arc distribution

V 1.0 features:
This is a top-down shooter template project, including a lot of components and assets to start a top-down shooter game in seconds. It's been made very simple and accessible so anyone can use it. The player moves relative to the camera view, and aims with the mouse or the right stick of the joystick. You can pick up items, activate panels, open or close doors, destroy actors, throw grenades and more. Weapons are fully tweakable, with parameters like fire rate, bullet speed, acceleration, divergence, damage, and so on. Bullets are managed with a simple pooling system. AI works using NavMeshes and a simple waypoint system, hearing player sounds like shooting or footsteps and using distance and angle of view to decide whether they chase you, aim at you, or attack you.
0 notes
codeofelm · 6 years ago
Text
Day 4 - Unity - Tags, Static Objects & Saving
Tags
Let’s get to know tags. Tags are reference words and are useful for things like triggers in collider scripts. 
Tumblr media
The GameObject.FindWithTag() function lets you find a GameObject by setting it to look for any object that contains the Tag you want. The following example from Unity uses the GameObject.FindWithTag(). It instantiates respawnPrefab at the location of GameObjects with the Tag “Respawn”:
using UnityEngine;

public class Example : MonoBehaviour
{
    public GameObject respawnPrefab;
    public GameObject respawn;

    void Start()
    {
        if (respawn == null)
            respawn = GameObject.FindWithTag("Respawn");

        // 'Instantiate(...) as GameObject' on its own isn't a valid statement,
        // so assign the spawned clone to a variable instead.
        GameObject clone = Instantiate(respawnPrefab,
            respawn.transform.position, respawn.transform.rotation);
    }
}
Creating new tags The inspector shows the Tag and Layer drop down menus just below the gameobjects name. Select Add Tag… Once you name a tag you cannot rename it later. 
A GameObject can only have one tag assigned to it, so you’ll want to be sure what tag you want to use for it. However, there are some built-in tags which will not appear in the tag manager. These are: 
Untagged
Respawn
Finish
EditorOnly
MainCamera
Player
GameController
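Tying this back to the trigger use mentioned at the top: a small hedged sketch of checking the built-in "Player" tag inside a trigger (the component name and log text are just examples):

using UnityEngine;

public class PlayerOnlyTrigger : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // CompareTag is preferred over other.tag == "Player",
        // as it avoids allocating a string each call.
        if (other.CompareTag("Player"))
            Debug.Log("Player entered the zone");
    }
}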
Static GameObjects
Optimizations need to know if an object can move during gameplay. Rendering can be optimized by combining several static objects into a single large object, known as a batch.
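Static flags are usually set via the Static checkbox in the editor, but for geometry spawned at runtime there is also StaticBatchingUtility - a small sketch, where the root object is an assumption for illustration:

using UnityEngine;

public class BatchAtRuntime : MonoBehaviour
{
    public GameObject environmentRoot; // parent of meshes that will never move

    void Start()
    {
        // Combines the root's children into static batches,
        // as if they had been marked static in the editor.
        StaticBatchingUtility.Combine(environmentRoot);
    }
}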
Saving 
Version Control
Tumblr media
You won't get very far in your project's development if you don't save, and save often. Of course, sometimes you run into errors you accidentally made a while back, or can't pinpoint exactly when it all went to shit. This is where you want to preserve incremental changes and also allow rollbacks.
Some popular version controls are Perforce, Git and Subversion.
I use Perforce every day in my day job, but I don't use it at home for these personal projects. This is mostly because it's still very early stages, and the code/project isn't really being touched by anyone other than myself currently.

That keeps things pretty simple. So for now I just utilise Git, GitHub and GitHub Desktop to work with branches. Git allows you to revert commits/branches, so to some extent it's safer than having the project under no version control at all. For the record, Git is an open-source version control system, and GitHub lets you store your repositories. GitHub Desktop is a GUI for Git, meaning I don't have to bother with the Git command line for a lot of things.

Maybe in the future I'll do some posts/breakdowns on Git, GitHub and GitHub Desktop. Let me know if that interests you!
In-engine saving
Tumblr media
It might feel a little counter-intuitive, but when you press Save Scene, you save the changes made to your current scene as well as changes to the project.

However, when you choose Save Project, you save project-wide changes but not changes made to your scene.

What project-wide changes would you save? Well, these are the things you find within your project settings.
Input is saved as 'InputManager.asset'
Tags and Layers are saved as 'TagManager.asset'
Audio is saved as 'AudioManager.asset'
Time is saved as 'TimeManager.asset'
Player is saved as 'ProjectSettings.asset'
Physics is saved as 'DynamicsManager.asset'
Physics 2D is saved as 'Physics2DSettings.asset'
Quality is saved as 'QualitySettings.asset'
Graphics is saved as 'GraphicsSettings.asset'
Network is saved as 'NetworkManager.asset'
Editor is saved as 'EditorUserSettings.asset'
Build settings are also saved, in the ProjectSettings folder, as 'EditorBuildSettings.asset'. Changes in the Project window that don't have an apply button are also saved, such as material parameters, prefabs, animator controllers/state machines, avatar masks and any other no-apply-button changes.
Some changes, however, are written to disk immediately. These are:
Changing the texture type of an image asset
Changing the scale factor of an 3D model asset
Changing the compression settings of an audio asset
Any other import setting change which has an “apply” button
Other changes which are saved immediately: a few other types of data are saved to disk immediately or automatically, without the need to perform a "Save" action:
The creation of new assets, eg: new materials or prefabs (But not subsequent changes to those assets)
Baked Lighting data (saved when the bake completes).
Baked navigation data (saved when the bake completes)
Baked occlusion culling data (saved when the bake completes)
Script execution order changes (after “apply” is pressed, this data is saved in each script’s .meta file)
That concludes today's post! If you have any questions about saving, version control, tags, or static objects, let me know and I'll break it down some more in a future post.

Thanks for reading.
0 notes